Keeping the data in your InfoProviders current is critical if you want to give end users reports with any real value. Automation is the key to efficiently performing the updates that keep the data fresh. Here’s a way to automate even the toughest updates using process chains and a little custom code.
Key Concept
Automating transactional and master data loads as well as other jobs in BW 3.x can be streamlined using process chains. They group processes into sequences such that each process in the chain waits for its predecessor to be completed.
You have two basic methods to update data in your InfoProviders. You can delete all the data and reload it again, or add data incrementally via a delta update. Sometimes, however, neither of these methods is adequate. Because of the way the system maintains records, a delta update could result in multiple errors, and starting over from scratch puts such an onerous overhead load on your resources that it’s just not feasible.
Consider the following example. When certain financial activities are performed in R/3, the system stores the results in table ECMCT in the Consolidation ledger. Records are uploaded from there into the Business Consolidation (BCS) InfoCube in BW. If any mistakes are made during the consolidation process, the incorrect data is passed on to BW. Correcting the error in R/3 causes a problem for BW because rather than reverse the value of the mistaken item and bring it to zero in table ECMCT, the source system deletes the bogus record altogether. Updating the record in BW via a delta is not possible because the program no longer “sees” the record in R/3.
As a result, erroneous records are retained in the BCS InfoCube instead of being updated with the correct values. Over time, these types of bad records stack up and it becomes necessary to delete all the data in the InfoCube and reload it again from scratch. Doing a complete reload is at best an inelegant solution and often impossible for certain applications.
I encountered just this type of problem recently. My client required an InfoCube to be updated hourly with lots of data. That was too frequent to allow reloading from scratch, but I knew that InfoCube was being loaded with inaccurate records that were later deleted from the R/3 sources. The bad information was then being disseminated in reports. I developed a way to split the upload into separate historical and current layers with a process chain. I also had to add a little ABAP code to the mix to make sure the job was done correctly.
The solution I’ll describe is not restricted to BCS InfoCube updates. It can be used as an alternative for any upload method regardless of the InfoCube being refreshed or the reports being generated. The core of this solution is designed to accommodate those situations where InfoCubes need to be updated frequently but it is not possible to use a normal delta upload.
What’s a Process Chain?
Let’s take a quick look at process chains before moving on to my solution. With the release of BW 3.x, process chains made it easier to set up and maintain the batch automation of transactional and master data loads as well as other tasks, including Reporting Agent jobs and index and statistic rebuilds. They are created in Administrator Workbench or via transaction RSPC and support job scheduling and monitoring within BW.
A process is the object created when a developer selects a process type and configures a specific variant. Process chains group jobs — or processes — together in a sequence determined by the interdependencies between each process. When activated, each process in a process chain is assigned a system-generated event and waits for its predecessor to be completed.
A process chain may have any number of process variants, which contain the parameters for running a process in a particular form. SAP ships predefined process variants with BW that you can use out of the box to develop process chains. If these variants are not enough, you can also customize your own process variants. The process variants provided by SAP cover many jobs, allowing you to run everything from data loads to ABAP programs.
Note that each process has a defined beginning and end point. You cannot reuse the same process twice in a chain, but you can create any number of variations of each process type. For more details about process chains, see “12 Tips to Automate BW Using Process Chains.”
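Conceptually, a process chain behaves like a sequence of jobs in which each job is triggered by the completion event of its predecessor. The following Python sketch is purely illustrative; BW implements this internally with system-generated events, and all names in the sketch are hypothetical:

```python
# Illustrative sketch of event-based chaining: each process runs only
# after its predecessor completes, and a failure stops the chain.

def run_chain(processes):
    """Run each process in order; abort if a predecessor fails."""
    results = []
    for process in processes:
        ok = process()           # stands in for the completion event
        results.append(ok)
        if not ok:               # a failed process stops the chain
            break
    return results

# Example: a two-step chain (e.g., delete InfoCube contents, then
# load an InfoPackage). Both steps succeed here.
chain = [lambda: True, lambda: True]
print(run_chain(chain))  # [True, True]
```

The point of the sketch is the dependency rule: a process never starts until its predecessor has finished, which is exactly what the system-generated events in a BW process chain enforce.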
Updating with Process Chains
To update the data in my client’s InfoCube, I created two process chains, one to load the historical data and one to refresh the InfoCube with current data. The process chains alone were not enough, however, to perform the update correctly. In fact, they created a potentially disastrous situation, one that made duplicate data available to end users. To correct this and make the update process completely automatic, I also wrote ABAP code for the variables in the BEx Query Designer, which I’ll discuss later.
The first step in my solution is to build two InfoPackages for the same InfoCube: one for the historical data upload and the other for the current data layer. Next, using the process chain utility, which is accessed either with the transaction code RSPC or via the chain link icon in Administrator Workbench, create separate process chains for the historical and current data requests. Figure 1 shows the process chain that initiates and runs the InfoPackage for the current data loads.

Figure 1
Use the process chain utility to automate your data loads
The historical data upload is performed first. At strategic times, such as when the current data layer becomes too large to be maintained efficiently, the historical data may need to be reloaded. All data must be deleted from the InfoCube before the historical data is reloaded. The process chain to automate this load is simple. It consists of two steps: The first deletes the data from the InfoCube and the second uploads the InfoPackage responsible for loading the historical data layer.
The process chain responsible for uploading the current data requests does not delete all the data from the InfoCube. Instead, it uses the standard process type Delete Overlapping Requests (Figure 2) that ships with BW. Process types determine which tasks the process performs and which properties it maintains. The process variant is defined along with the process type. In this example, DELETE_OVERLAPS is entered in the Variant field. This process is critical to the chain: because current data is refreshed hourly, the amount of redundant data in the InfoCube would otherwise grow very large very quickly, and most reports would return incorrect values.

Figure 2
The Delete Overlapping Requests process maintenance screen with DELETE_OVERLAPS entered in the Variant field
The Delete Overlapping Requests process compares requests uploaded into the InfoCube and deletes those with the same or overlapping upload parameters. In this case it identifies the InfoPackages used to load the current data because they have the same selection criteria. Running this process in the InfoCube eliminates any data overlaps. Without it, users would consistently receive reports with erroneous information due to redundant data. For example, a report might show a $200 value when the correct value is $100 because the same data has been uploaded twice. With the Delete Overlapping Requests process in the chain, the system can track down and delete the data from the previous upload.
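The effect of the process can be sketched as follows. This Python sketch is an illustration only (the real process is a standard BW process type, not custom code); it assumes each request carries a sequential ID and its selection criteria, and it keeps only the newest request for each set of criteria:

```python
def delete_overlapping(requests):
    """Keep only the newest request per set of selection criteria.

    Each request is a (request_id, selection_criteria) tuple; IDs are
    sequential, so a higher ID means a newer upload.
    """
    newest = {}
    for req_id, criteria in requests:
        key = frozenset(criteria.items())   # requests overlap if criteria match
        if key not in newest or req_id > newest[key]:
            newest[key] = req_id
    return sorted(newest.values())

# Two current-data requests share the same selection criteria, so the
# older one (ID 101) is deleted; the historical request (ID 1) survives.
# The "layer" criteria values are hypothetical stand-ins.
requests = [
    (1,   {"layer": "historical"}),
    (101, {"layer": "current"}),
    (102, {"layer": "current"}),
]
print(delete_overlapping(requests))  # [1, 102]
```

This mirrors the $100 vs. $200 example above: once request 101 is gone, the current data is counted only once.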
Links in the Chain
You can monitor the historical and current data loads in Administrator Workbench by selecting InfoProviders, locating and right-clicking on the appropriate InfoCube, and choosing Manage. The data load status is displayed on the Requests tab. The historical data remains static, but the two lines for the current data requests are more dynamic.
When the process chain runs for current data uploads, it performs five steps:
- Initiates the request for the current data.
- Processes the InfoPackage responsible for the current data request, and loads the current data. New data is loaded into the InfoCube (Figure 3) at this point, along with the same data delivered in the previous current data request.
- Runs the standard Delete Overlapping Requests process. Comparing the selection criteria of the requests, this process determines that there is an overlap between the newly run current data request and one previously run, which has already been uploaded (Figure 4).
- Deletes the data from the older current data request that overlaps with the most recent current data request after determining that the selection criteria for both current data requests are the same and are both uploaded in the same InfoPackage (Figure 5).
- Loads the scrubbed current data into the InfoCube at the end of the process chain.

Figure 3
The process chain refreshes the earlier current request with a new request

Figure 4
For 5 to 10 minutes, both current data requests are available for reports

Figure 5
The process chain deletes the older current data request
While the process chains result in the InfoCube being updated with the two requests, they create a huge problem! As you can see in Figure 4, there is a period of between five and 10 minutes before the process chain deletes the data from the previous request for current data and during that period data from both requests resides in the InfoCube.
The enormity of the problem is clear: For a significant period of time, the current data is doubled and that incorrect data will be delivered to your end users’ reports. To make matters worse, because the problem is in the program and not a technical issue with the system, no error message is displayed to warn end users.
ABAP to the Rescue
I wrote ABAP code so that the BEx reports look only at the correct requests: the historical data and the data from the most recently run current request. With my method, the reports ignore the data from the older current data request. This solution is running in my client’s production environment and is transparent to the end users.
In BEx Query Designer, I used the variables editor, which is accessed by clicking on the pencil icon, to define two variables based on the Data Package characteristic. Figure 6 shows the screen for ZERLREQ.

Figure 6
In BEx Query Designer, create two variables in the variables editor
I also created ZLATREQ, which appears in the Filter area. The Processing by field is defined with Customer Exit, indicating that these variables are processed in user exits. User exits are where custom programs like the ABAP code in Figure 7 are added.
*&---------------------------------------------------------------------*
*&  Include           ZXRSRU01
*&---------------------------------------------------------------------*
tables: /BI0/SREQUID,
        /BI0/D0ECCS_C01P,
        RSREQDONE.

DATA: L_S_RANGE TYPE RSR_S_RANGESID,
      wa_i_t_var_range like line of i_t_var_range,
      it_/BI0/D0ECCS_C01P like /BI0/D0ECCS_C01P occurs 0
                          with header line.

case I_VNAM.
*----------------------------------------------------------------------*
  when 'ZERLREQ'.
    if i_step = '2'.
      read table i_t_var_range into wa_i_t_var_range
           with key vnam = 'ZERLREQ'.
*     Collect all request SIDs from the InfoCube's package dimension
      select * from /BI0/D0ECCS_C01P into table it_/BI0/D0ECCS_C01P
             where SID_0REQUID <> 0.
      CLEAR: L_S_RANGE, RSREQDONE.
*     Ascending sort: the first successfully loaded request is the earliest
      sort it_/BI0/D0ECCS_C01P by SID_0REQUID ascending.
      loop at it_/BI0/D0ECCS_C01P.
        clear: /BI0/SREQUID.
        select single requid from /BI0/SREQUID into /BI0/SREQUID-requid
               where sid = it_/BI0/D0ECCS_C01P-SID_0REQUID.
        select single tstatus from RSREQDONE into (RSREQDONE-tstatus)
               where rnr = /BI0/SREQUID-requid.
*       '@08@' is the green (successfully finished) status icon
        if RSREQDONE-tstatus = '@08@' or RSREQDONE-tstatus = '@08'.
          exit.
        endif.
      endloop.
      L_S_RANGE-LOW  = /BI0/SREQUID-requid.
      L_S_RANGE-SIGN = 'I'.
      L_S_RANGE-OPT  = 'EQ'.
      APPEND L_S_RANGE TO E_T_RANGE.
    endif.

  when 'ZLATREQ'.
    if i_step = '2'.
      read table i_t_var_range into wa_i_t_var_range
           with key vnam = 'ZLATREQ'.
      select * from /BI0/D0ECCS_C01P into table it_/BI0/D0ECCS_C01P
             where SID_0REQUID <> 0.
      CLEAR: L_S_RANGE, /BI0/SREQUID, RSREQDONE.
*     Descending sort: the first successfully loaded request is the latest
      sort it_/BI0/D0ECCS_C01P by SID_0REQUID descending.
      loop at it_/BI0/D0ECCS_C01P.
        clear: /BI0/SREQUID.
        select single requid from /BI0/SREQUID into /BI0/SREQUID-requid
               where sid = it_/BI0/D0ECCS_C01P-SID_0REQUID.
        select single tstatus from RSREQDONE into (RSREQDONE-tstatus)
               where rnr = /BI0/SREQUID-requid.
        if RSREQDONE-tstatus = '@08@' or RSREQDONE-tstatus = '@08'.
          exit.
        endif.
      endloop.
      L_S_RANGE-LOW  = /BI0/SREQUID-requid.
      L_S_RANGE-SIGN = 'I'.
      L_S_RANGE-OPT  = 'EQ'.
      APPEND L_S_RANGE TO E_T_RANGE.
    endif.
endcase.

Figure 7
ABAP code to find the latest current request and the historical data loaded into the InfoCube
Tip!
Do not deploy the process chain described in this section to refresh current data without implementing the ABAP code in Figure 7. The data in your reports is likely to be incorrect.
In the Filter area of BEx Query Designer, use the variables ZERLREQ and ZLATREQ to restrict the query to the historical request and the most recent current request. These custom variables filter the query while it is running, so the previously run current requests are ignored.
The Program
All BW requests loaded into InfoProviders receive a unique system ID that is never repeated. The ID number for each new request is sequential and increases by one with every new request (new ID number = 1 + last ID number). The two variables in the custom code filter the InfoCube for the lowest and highest request ID numbers and ignore the other requests.
The ABAP code in Figure 7 was written for the custom BEx variables and searches the requests for those with the smallest and largest ID numbers. Variable ZLATREQ identifies the latest current request and ZERLREQ spots the earliest request, which is the historical data request that has already been uploaded to the InfoCube.
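Stripped of the SAP table access, the logic of the two variables amounts to scanning the requests loaded into the InfoCube for the smallest and largest successfully finished IDs. The following Python sketch is an illustration only; the dictionary and the "green" status string are simplified stand-ins for the /BI0/D0ECCS_C01P, /BI0/SREQUID, and RSREQDONE lookups in the actual ABAP code:

```python
def earliest_and_latest(requests):
    """Return the (earliest, latest) successfully loaded request IDs.

    `requests` maps request ID -> status; "green" stands in for the
    RSREQDONE '@08@' (successfully finished) status checked in ABAP.
    """
    loaded = sorted(rid for rid, status in requests.items()
                    if status == "green")
    if not loaded:
        return None, None
    # These correspond to the ZERLREQ and ZLATREQ variable values.
    return loaded[0], loaded[-1]

# Request 1 is the historical load, 101 and 102 are current requests.
# The query filters on IDs 1 and 102, so the overlapping older current
# request 101 is ignored even while it still resides in the InfoCube.
requests = {1: "green", 101: "green", 102: "green"}
print(earliest_and_latest(requests))  # (1, 102)
```

This is why the five-to-ten-minute overlap window becomes harmless: even before Delete Overlapping Requests removes the older request, the query never reads it.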
Enter transaction code CMOD to maintain the user exit and create a component by clicking on the Component button. In the next screen, select RSR00001, which is the enhancement for global variables in reporting. Double-click on function exit EXIT_SAPLRS_001 to access the standard function code via the Function Builder. Locate and double-click on INCLUDE ZXRSRU01 at the end of the code (Figure 8) to enter the custom code in Figure 7. Remember that this code is available at the BW Expert Web site.

Figure 8
Locate and double-click on INCLUDE ZXRSRU01 to add custom ABAP code
I should point out that, with the exception of the specific InfoProvider, which can be customized to your application, this ABAP program is generic. Table /BI0/SREQUID contains all request ID numbers generated by a BW system. Table RSREQDONE monitors the status of request runs. It indicates which stage the upload process is at, e.g., finished, in process, or failed, via the familiar green, yellow, and red light icons (respectively). The data package table /BI0/D0ECCS_C01P is specific to the 0ECCS_C01 InfoCube.
Iliya Ruvinsky
Iliya Ruvinsky is a managing partner at Skywind Consulting Ltd., Israel. He is an SAP-certified BW consultant and instructor with more than 12 years of experience working with SAP BW and SAP BusinessObjects. He is an implementation and project management expert, serving for more than eight years as a trusted advisor to a wide range of Israeli enterprises, including in the insurance, energy, sales, and logistics industries. He is a graduate of the University of Tel Aviv, Israel, holding an MBA in information systems analysis.
You may contact the author at iliya.r@skywind.co.il.
If you have comments about this article or publication, or would like to submit an article idea, please contact the editor.