After you have connected your BW system to an external database using DB Connect, you can make the DataSource delta capable with ABAP code using the time-stamp technique.
Key Concept
Most databases have a method for identifying new records and changes whereby each new and changed record carries the time of the change in year/month/day/hour/minute/second form. These details are known as a time stamp. Oracle databases have a time stamp for all their tables unless they are specifically created otherwise. Using the time-stamp technique, delta records can be identified from the time-stamp details.
SAP BW from Release 3.0 on allows you to extract data from an external database using DB Connect. However, delta capability is not automatically available. In part 1 of this series, we explained the first two steps for making an external DataSource delta capable: how to connect to the external database and how to create and configure the DataSource on the external system. In step 3, which we describe in this article, you make the DataSource delta capable with custom ABAP code.
We use the time-stamp technique, in which the delta records are identified using the date and time of creation or change captured by an application on the external system. Technically not a true delta, this method gives the user the delta records. In a true delta, the system tracks records that are already extracted. An overview of the coding and logic we used follows:
Step 1. Find the time of the last successful extraction. You limit the extraction to the records created/changed after the last successful extraction. Using coding, you identify the last successful extraction date/time. The delta mechanism works by comparing the time stamp of the InfoPackage run with all time stamps of the records. It then chooses those records whose time stamps are greater than or equal to the InfoPackage time stamp.
Step 2. Calculate the safety delta time to arrive at the extraction parameters of the time stamp in the InfoPackage.
Step 3. Convert the time details, which are in Coordinated Universal Time (UTC) format, to the time format of the user.
Before getting to details of these three steps, let’s look at the “view definition” of the database table. You need to make some changes while creating the view. Figure 1 shows the code for creating a view based on an Oracle table.
CREATE VIEW SMARTBI.INCOMING_VIEW
  (TRANSACTION_ID, RAW_TXN_ID, TXN_TYPE, DATE_OF_TXN, TIME_OF_TXN,
   CURRENCY_ID, TXN_AMT, SETTLEMENT_ID, MOD_DATE, MOD_TIME,
   MERCHANT_ID, TERMINAL_ID, BATCH_ID, ROC, CARD_ID)
AS SELECT
   TRANSACTION_ID,
   RAW_TRANSACTION_ID AS RAW_TXN_ID,
   TRANSACTION_TYPE AS TXN_TYPE,
   TO_CHAR(DATE_OF_TRANSACTION, 'YYYYMMDD') AS DATE_OF_TXN,
   TO_CHAR(DATE_OF_TRANSACTION, 'HH24MISS') AS TIME_OF_TXN,
   CURRENCY_ID,
   TRANSACTION_AMOUNT AS TXN_AMT,
   SETTLEMENT_ID,
   TO_CHAR(MODIFICATION_DATE, 'YYYYMMDD') AS MOD_DATE,
   TO_CHAR(MODIFICATION_DATE, 'HH24MISS') AS MOD_TIME,
   MERCHANT_ID, TERMINAL_ID, BATCH_ID, ROC, CARD_ID
FROM SMARTDBA.INCOMING_TRANSACTIONS
Figure 1
Create a view based on a table in Oracle
With the code in Figure 1, we created a view for the table that stores the source data we want to extract. We did that because the table name INCOMING_TRANSACTIONS is longer than 16 characters; the view name INCOMING_VIEW overcomes the limitation. Note that these tables and views are not standard SAP tables. Because you are connecting BW to an external application, the tables and views on the external application are developed by the designer of that application. If the table and field names already fall within the length limits, you do not need to create a view.
The application data in our example is stored in a schema called smartdba. incoming_transactions is the table in the Oracle database from which we will extract the data.
Note in Figure 1 that fields whose names exceed 16 characters are renamed; for example, RAW_TRANSACTION_ID becomes RAW_TXN_ID. The date format is changed to YYYYMMDD and the time to HHMMSS in 24-hour format.
Step 1. Identify the Last Successful Extraction
Whenever you execute an InfoPackage to extract data, the details are stored in system tables in SAP BW. This includes details such as the name of the InfoPackage, the request number generated by the system, the date and time when the InfoPackage was executed, the Quality Management (QM) status of this request, and the number of records extracted. One of the useful database tables for our requirement is RSREQDONE, which gives the information required to make the DataSource delta capable. All of the above field values are stored whenever any InfoPackage is executed to extract the data from the source system regardless of the source system.
To implement the process, create an InfoPackage to extract all existing records. This brings all the records from the source system to BW. This InfoPackage is executed without any selection criteria. The details are stored in table RSREQDONE in BW. Make sure to enter the InfoPackage entry into table ZSC_DELTA_IP with the flag as I for initial upload.
Let’s say the InfoPackage for the initial load is ZPAK_3U94F3YC8LY7XAG26PWU2OCQ6. The system generates the technical name of the InfoPackage. We executed this InfoPackage on 06.11.2004 at 10:11:32. All the details of this InfoPackage are stored in the table RSREQDONE along with the QMSTATUS and number of records extracted.
When you extract the delta data from this DataSource, you want only those records that are newly created or changed after the last successful extraction. To get the last successful extraction date and time, query on table RSREQDONE using the technical name of the InfoPackage. You may find many records for the same InfoPackage depending on the number of times you have executed it. Therefore, you need to put those entries in the system memory and find the last successful run. At this point we expect only one entry since we executed the initial upload InfoPackage only once.
To put the entries into memory, you create internal tables by writing a small piece of ABAP code. BW has a facility at the InfoPackage level to write an ABAP routine that creates dynamic selection criteria (Figure 2). This ABAP code fetches the date and time of the last successful extraction, which are passed as selection criteria for the current extraction.

Figure 2
InfoPackage selection screen
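To make the mechanism concrete, here is a skeletal sketch of what such a selection routine can look like. The l_t_range/p_subrc interface is the one BW generates for InfoPackage selection routines; the form name, the field MOD_DATE, and the variable wa_ts_date come from our example, and the body is an outline only, not the article's exact code:

```abap
* Sketch of an InfoPackage selection routine for the field MOD_DATE.
* The l_t_range/p_subrc interface is generated by BW; the routine body
* is an outline - wa_ts_date is the date of the last successful
* extraction minus the safety delta, derived by the logic of steps 1-3.
form compute_mod_date
  tables   l_t_range structure rssdlrange
  changing p_subrc   like sy-subrc.

  data: l_idx like sy-tabix.

* locate the selection row for MOD_DATE
  read table l_t_range with key fieldname = 'MOD_DATE'.
  l_idx = sy-tabix.

* restrict the extraction to dates from wa_ts_date onward
  l_t_range-sign   = 'I'.
  l_t_range-option = 'BT'.
  l_t_range-low    = wa_ts_date.
  l_t_range-high   = '99991231'.
  modify l_t_range index l_idx.

  p_subrc = 0.
endform.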
We recommend a separate InfoPackage to differentiate between the initial upload and subsequent delta uploads. Table RSREQDONE stores the data for each InfoPackage that was executed, and from it you retrieve the date and time of the last successful extraction. You create another InfoPackage for subsequent deltas. To differentiate the first delta from the following ones, you need to write some custom code. Create a Z table; we called it ZSC_DELTA_IP. It contains four fields (Figure 3):
- ZZ_FIRST_DELTAIP stores the technical name of the InfoPackage used to run the initial upload
- ZZ_SUBSQ_DELTAIP stores the technical name of the InfoPackage used to run delta uploads
- ZZ_FIRST_DELTA indicates by a “yes” or “no” whether the current run is the first delta or a subsequent upload or initial upload
- ZZ_INFOSOURCE indicates the name of the InfoSource to which this DataSource is connected

Figure 3
Structure of table ZSC_DELTA_IP
Since the initial upload does not have an ABAP routine attached to it, it goes to the source system and extracts all the records for that particular DataSource. In the InfoPackage used for the delta upload, the custom ABAP code first evaluates whether this is the first or a subsequent delta upload. As a one-time activity to support this, you enter the technical names of the two InfoPackages (initial upload and delta upload) into this table with a "Y" flag and the name of the InfoSource. The InfoSource name has no other technical significance in this delta technique, but it helps the end user recognize the pair of InfoPackages, since their technical names are generated by the system.
To give an example, the name of the initial InfoPackage is ZPAK_3U94F3YC8LY7XAG26PWU2OCQ6 and the name of the subsequent delta InfoPackage is ZPAK_3U67PT8H4PCBV9KMZWV94GNZG. You need to create an entry in table ZSC_DELTA_IP with these details:
ZZ_FIRST_DELTAIP = ZPAK_3U94F3YC8LY7XAG26PWU2OCQ6
ZZ_SUBSQ_DELTAIP = ZPAK_3U67PT8H4PCBV9KMZWV94GNZG
ZZ_FIRST_DELTA   = I for initial upload
                 = Y for first delta upload
                 = N for subsequent delta uploads
Once you create the above entry, the ABAP code below helps to determine the InfoPackage name, which then helps you get the time-stamp entry from table RSREQDONE. The code below is used in the InfoPackage ZPAK_3U67PT8H4PCBV9KMZWV94GNZG, which extracts the delta upload.
One limitation is that you are unable to use a variable to get the name of the InfoPackage in the ABAP code. Therefore, this is the only hardcoded area of our technique.
In the code in Figure 4, we used a custom-built function module, Z_BW_SC_GET_INFOPACKAGE, which determines which InfoPackage is used to get the time stamp. The extracted time stamp is used to identify the delta records from the source system.
FUNCTION Z_BW_SC_GET_INFOPACKAGE.
  tables: ZSC_DELTA_IP.
  select single * from ZSC_DELTA_IP
         where ZZ_SUBSQ_DELTAIP = ZZ_IP_INFOPACKAGE.
  if sy-subrc = 0.
    if ZSC_DELTA_IP-ZZ_FIRST_DELTA = 'Y'.        " First delta load
      ZZ_OP_INFOPACKAGE = ZSC_DELTA_IP-ZZ_FIRST_DELTAIP.
      ZZ_FIRST_DELTA    = ZSC_DELTA_IP-ZZ_FIRST_DELTA.
    elseif ZSC_DELTA_IP-ZZ_FIRST_DELTA = 'N'.    " Subsequent delta load
      ZZ_OP_INFOPACKAGE = ZSC_DELTA_IP-ZZ_SUBSQ_DELTAIP.
      ZZ_FIRST_DELTA    = ZSC_DELTA_IP-ZZ_FIRST_DELTA.
    elseif ZSC_DELTA_IP-ZZ_FIRST_DELTA = 'I'.    " Initial load
      ZZ_FIRST_DELTA    = ZSC_DELTA_IP-ZZ_FIRST_DELTA.
    endif.
  endif.
ENDFUNCTION.
Figure 4
ABAP code of the function module Z_BW_SC_GET_INFOPACKAGE
The input passed to this function module is the technical name of the InfoPackage used to extract the delta upload. Using the entries in table ZSC_DELTA_IP and the flag ZZ_FIRST_DELTA, it determines which InfoPackage is used to get the time-stamp information from the table RSREQDONE.
Once you determine the name of the InfoPackage, get the time stamp of the last successful request. To retrieve the time stamp, query the table RSREQDONE using the technical name of the InfoPackage. Use two more conditions: the number of records extracted should be greater than zero (RECORDS > 0) and the technical status of the request should be green (TSTATUS = '@08@'). Since the query may return more than one record, put the results in an internal table using ABAP code.
You select the latest request from these records. To do that, sort the internal table IT_RSREQDONE by the field TIMESTAMPBEGIN in descending order; the latest record is then the first record in the internal table. Pick up the values stored in the first record, particularly DATUM, UZEIT, and TIMESTAMPBEGIN.
The fields DATUM and UZEIT store the date and time, respectively, when the extraction started. The field TIMESTAMPBEGIN stores the same information in a different format. We stored the values in the local variables WA_DATE_RSREQDONE, WA_TIME_RSREQDONE, and WA_RSREQDONE_TS, respectively.
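Put together, the retrieval described above might be coded along these lines. This is a sketch rather than the article's exact code; it assumes, as in our system, that RSREQDONE field LOGDPID holds the InfoPackage name, and the work-area names follow the article:

```abap
* Sketch: fetch all successful requests for the determined InfoPackage
* and pick the latest one.
data: it_rsreqdone type standard table of rsreqdone with header line.

select * from rsreqdone into table it_rsreqdone
       where logdpid = zz_op_infopackage   " from Z_BW_SC_GET_INFOPACKAGE
         and records > 0                   " at least one record extracted
         and tstatus = '@08@'.             " green technical status

sort it_rsreqdone by timestampbegin descending.
read table it_rsreqdone index 1.           " latest successful request

wa_date_rsreqdone = it_rsreqdone-datum.
wa_time_rsreqdone = it_rsreqdone-uzeit.
wa_rsreqdone_ts   = it_rsreqdone-timestampbegin.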
Step 2. Calculate the Safety Delta
While the extraction is being carried out, documents could be created in the system. These are likely to be missed in the extraction unless you use a safety delta. We use it a little differently than SAP does. Our objective is not to ensure we don’t miss out on any records, but to extract specific delta records.
Take the example of a record with a time stamp of 10:00:01 that was committed to the database only at 10:00:05, a delay of four seconds. If an extraction ran in that interval, the record would be missed without a safety delta. With a safety delta, you also extract records that carry a time stamp earlier than that of the last successful extraction, so the two successive extractions overlap. The appropriate amount of time differs from application to application: the larger the safety delta, the lower the risk of missing records, but the more duplicate records are extracted.
That is why we recommend using an ODS object in the data staging process to eliminate the duplicate records. Normally after extraction the records could be directly uploaded to an InfoCube for multidimensional analysis, but in this case, we recommend that the records first be loaded to an ODS. The ODS subsequently loads the records to an InfoCube. For more detail on employing an ODS to eliminate duplicate records, see our article, “Design Better Custom Extractors for ODS Delta Loads.”
Now we’ll show you how to retrieve the value of the safety delta. To provide flexibility in the process, we created one more Z table — ZSC_SAFETY_DELTA — with only one field (Figure 5).

Figure 5
Z table to designate the default safety delta
This table, which has just one record with the value in hours, is maintained manually. The value in hours is subtracted from the time stamp, i.e., the date and time when the extraction started on the source system side. The value is read by ABAP code (in the second function module), so if you want to change the value of the safety delta, you do not need to change the ABAP code. If you want to go below one hour or above 24 hours, use the Z table ZSC_SET_EXT_TIME (Figure 6).

Figure 6
Z table for a safety delta of less than one hour or more than 24 hours
If you want to use the safety delta in hours, keep the table in Figure 6 empty. To use the safety delta in seconds, minutes, or days, put the appropriate entry in the custom table; the custom module handles the rest of the calculation. If table ZSC_SET_EXT_TIME has any record, the safety delta in hours stored in table ZSC_SAFETY_DELTA is not used. Table 1 shows how the entries are stored. Depending on the value of the safety delta, you maintain table ZSC_SAFETY_DELTA or table ZSC_SET_EXT_TIME.
Desired safety delta   ZZ_TIME_FLAG   ZZ_HOWMUCH
30 seconds             S              30
1 minute               M              1
2 days                 D              2

Table 1
What to fill in the table ZSC_SET_EXT_TIME
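The evaluation of the two tables could be coded along these lines. This is a sketch under stated assumptions: ZSC_SET_EXT_TIME takes precedence when it has a record, ZZ_HOURS is a hypothetical name for the single field of ZSC_SAFETY_DELTA (the article does not name it), and the remaining names follow the article:

```abap
* Sketch: resolve the safety delta to seconds. ZSC_SET_EXT_TIME wins
* when it contains a record; otherwise fall back to the hours table.
tables: zsc_set_ext_time, zsc_safety_delta.
data:   wa_safety_delta_in_seconds type i.

select single * from zsc_set_ext_time.
if sy-subrc = 0.
  case zsc_set_ext_time-zz_time_flag.
    when 'S'.                              " seconds
      wa_safety_delta_in_seconds = zsc_set_ext_time-zz_howmuch.
    when 'M'.                              " minutes
      wa_safety_delta_in_seconds = zsc_set_ext_time-zz_howmuch * 60.
    when 'D'.                              " days
      wa_safety_delta_in_seconds = zsc_set_ext_time-zz_howmuch * 86400.
  endcase.
else.
  select single * from zsc_safety_delta.
* ZZ_HOURS is a hypothetical name for the table's single field
  wa_safety_delta_in_seconds = zsc_safety_delta-zz_hours * 3600.
endif.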
Note
All InfoPackages, regardless of the DataSource/source system, create a record in the table RSREQDONE when executed. While getting the delta, our aim is to get the time stamp for a specific DataSource/source system combination. Each DataSource has multiple entries in the table RSREQDONE, depending on the number of times it was executed in the system. To get the correct time stamp, use the InfoPackage ID (the only information available to begin with) to filter the records in RSREQDONE. From this subset, sort in descending order of time stamp and take the first entry, which carries the maximum time stamp.
Note
For the initial upload, when you execute the InfoPackage, make sure that the value of the field ZZ_FIRST_DELTA is I. For the first delta extraction, make sure that the value of the field ZZ_FIRST_DELTA is Y for this InfoPackage entry. Once you have extracted the delta the first time, change this entry to N. This step is required only for the first delta after the initial full upload. It ensures that the program eliminates the entry for the initial upload InfoPackage while evaluating the InfoPackages for the deltas.
Step 3. Convert the Time Details
Now that you have the time stamp of the last successful request (using step 1) and the value of the safety delta (using step 2), you need to subtract the safety delta value from the time stamp to start the next extraction.
The time the system displays in the SAPGUI (as shown in the monitor) and the time in the database table are different. The database table RSREQDONE stores information in UTC format. That time is converted to local time when displayed in the SAPGUI.
When you calculate the time stamp using the safety delta, it does not give the correct time stamp for the next retrieval. To get the correct time you need to convert the UTC time into your local time using the ABAP statement shown below.
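The statement in question is the ABAP CONVERT TIME STAMP statement. A reconstruction with the variable names used in this article (the exact statement is not reproduced in this excerpt):

```abap
* Convert the UTC time stamp read from RSREQDONE into the date and
* time of the user's local time zone (SY-ZONLO)
convert time stamp wa_rsreqdone_ts time zone sy-zonlo
        into date wa_ts_date time wa_ts_time.
```

The INTO DATE ... TIME ... addition is what splits the result into the two local variables described next.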
The system variable SY-ZONLO allows the statement to convert the UTC TIMESTAMP into your local time. The above statement also splits the TIMESTAMP into two different local variables — WA_TS_DATE and WA_TS_TIME. You pass the date and time separately since the external application stores the values in two different fields.
Our safety delta is normally stored in hours, while the last extracted TIMESTAMP is available as a date and a time. You need to bring these two values to the same unit; we prefer to convert both into seconds to complete the calculation, as shown below.
Here, wa_default_safety_delta refers to the safety delta value picked up from table ZSC_SAFETY_DELTA. One statement converts the default safety delta into seconds; further ABAP code converts the date and time (the result of the conversion above) into seconds.
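A sketch of both conversions (the variable names follow the article; the substring offsets assume WA_TS_TIME is in HHMMSS format, as produced by the time-zone conversion):

```abap
* Hours -> seconds for the default safety delta
wa_safety_delta_in_seconds = wa_default_safety_delta * 3600.

* HHMMSS -> seconds since midnight for the last extraction time
wa_ts_time_in_seconds = wa_ts_time+0(2) * 3600     " hours
                      + wa_ts_time+2(2) * 60       " minutes
                      + wa_ts_time+4(2).           " seconds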
If the value of safety delta is greater than the time retrieved from the last request, you need to subtract a day with the statement shown below.
if wa_ts_time_in_seconds > wa_safety_delta_in_seconds.
  wa_ts_time = wa_ts_time - wa_safety_delta_in_seconds.
  wa_ts_date = wa_ts_date.
else.
* 24 hours * 60 minutes * 60 seconds = 86400 seconds
  wa_ts_time = 86400 - ( wa_safety_delta_in_seconds - wa_ts_time_in_seconds ).
  wa_ts_date = wa_ts_date - 1.
endif.
The calculations provide the time-stamp value required in the InfoPackage selection criteria MOD_TIME and MOD_DATE (the parameters for extracting delta records, as shown in Figure 2). However, we provide the value of MOD_DATE as the low range (beginning limit for the date) and 99991231 as the high range (upper limit for the date). As a result, records with time stamps from 00:00:00 of the beginning date up to the time derived from the safety delta are also extracted; they are removed in the start routine before you upload to the data target.
Shreekant W. Shiralkar
Shreekant W. Shiralkar is a senior management professional with experience in leading and managing business functions as well as technology consulting. He has authored best-selling books and published many white papers on technology. He also holds patents for innovations. Presently he is global head of the SAP Analytics Centre of Excellence at Tata Consultancy.
You may contact the author at s-shiralkar@yahoo.com.
If you have comments about this article or publication, or would like to submit an article idea, please contact the editor.
Bharat Patel
Bharat Patel is experienced in managing data warehouse technology for the petroleum industry. He is an SAP-certified BW and ABAP consultant, has authored a book on SAP BW, and teaches courses on BW and ABAP at the Sapient Academy and SAP Labs India. Bharat has presented papers about BW at Business Intelligence India Group (BIIG) conferences. He presently manages the SAP BW system at Bharat Petroleum, India.
You may contact the author at patelb@bharatpetroleum.com.