Manual data loads, such as daily and monthly refreshes, can impede your BW team and increase the risk of human error. Fortunately, data load automation is an attainable goal. The author demonstrates how to automate your load processes using batch jobs, events, InfoPackages, and custom code.
Many BW teams overlook their systems’ ability to completely automate load processes. Instead of automatically performing daily and monthly refreshes and other data loads, BW is often implemented in a manner heavily reliant on manual intervention, which makes it prone to all sorts of problems resulting from human error.
I have read some terrific tips on automating specific load steps, but they have not offered much insight into completely automating load processes. This is unfortunate, because a lack of automation can greatly hamper the availability and reliability of your overall system.
With the experience of one who has worked with BW since its infancy (1.2B) through 2.0B, and who is currently working with the BW 3.2 Business Content, I will detail the decisions you need to make before attempting fully automated load processing. I will offer tips for success and review some of the core pieces of functionality found in BW. Then I will walk you through how my team used BW features, augmented with custom coding, to achieve near-complete automation.
Key Design Principles
The BW environment where I work has matured over the last four years and now supports 500 users across the globe. Out of necessity, we have developed a list of fundamental load-process principles. My team uses these principles when maintaining the system so that it remains robust, easily supportable, and scalable enough to sustain future growth.
These key BW design principles form the backbone of the load processes established at the firm and allow the system to perform the following functions:
- Fully automated daily and month-end loads that require no intervention from the development and support teams with few exceptions, such as those occasional manual loads required to accommodate changes made by the business outside of agreed time frames
- Different load processes for a single InfoCube on different workdays
- Load processes that send automated messages to the development team when an error has occurred or when tolerance times are exceeded, as well as messages to both the development team and user base detailing the load status along with new data availability
- Automatic user lockout during the UK and US workdays while master data is updated and aggregates are built
Don’t Reinvent the Wheel
I encourage you to create a similar list of the processes and features that you require from your system. Once you have developed that list, the big question is: how do you fully automate the load process? Although each team will answer this question differently, there are two basic but important tenets to embrace:
- Wherever possible, use standard functionality. BW provides robust tools on which to build a sustainable load process.
- Think creatively about how to employ the standard feature set with homegrown code to achieve full automation.
To optimize your system’s performance, your goal should be nothing less than end-to-end, seamless automation. Automating the load process allows you to concentrate on the important tasks of meeting user requirements and reporting quality data.
BW provides good tools to get you up and running including batch jobs, InfoPackage groups, event chains, and event triggers. Fully employing such standard functionality wherever possible is a great way to begin automating your system, and combining these tools with custom code brings you close to full automation.
Batch Jobs and Factory Calendars
Batch jobs can serially perform ABAP programs, external commands, or external programs. Essentially, batch jobs can be used to control all of your processing. They can be scheduled to run at a specified date and time, scheduled in accord with a factory calendar, or run at completion of a job or after an event.
The ability of the system to run batch jobs is key to its functionality. Where I work, scheduled batch jobs are used to initiate each of the daily refresh processes for all production InfoCubes. ABAP programs within batch jobs initiate other processes on both R/3 and BW.
To create a batch job, use transaction code SM36 or follow the menu path System>Service>Jobs>Define Job.
On the screen shown in Figure 1, enter a Job name and set a high, medium, or low priority for the batch job in the Job Class field. Click on the Step button and enter the relevant programs and commands for the batch job using the Step List Overview interface (Figure 2). Click on the save icon and you will be returned to the initial screen.

Figure 1
Batch jobs are key to moving your BW system toward fuller automation

Figure 2
The Step List Overview screen allows you to enter the relevant programs and commands for a batch job
Clicking on Start Condition presents you with a number of different options to initiate a batch job, including the following:
- Immediate schedules a batch job for immediate processing.
- Date/Time schedules a batch job to run at a specified date and time.
- After job allows you to enter the name of a job that must finish before the next job is processed.
- After event sets a job to be processed after an event is completed. (Events are described in greater detail in the next section. In this case, an event is a signal stating that a predefined status in the system has been reached — for example, after the completion of an InfoPackage.)
- At operation mode is used to initiate a batch process when there is a change in operation mode.
- Workday/time (Factory calendar) is an option (Figure 3) that supports batch job scheduling for certain workdays and holidays in a particular country, and allows them to be scheduled on an ongoing basis taking into account weekends and public holidays.

Figure 3
The Workday/time (Factory calendar) option allows jobs to be scheduled, taking into account workdays and holidays in various locales
The Workday/time option provides a feature set that makes it particularly accommodating. The Workday field, for example, is used in conjunction with the Workday relative to options to schedule a job on a particular workday relative to the beginning or end of a month. The settings depicted in Figure 3 schedule a batch job on workday minus six relative to the month's end. The Time, Do not execute before, and Period fields allow further refinement of when a batch job is initiated.
At my company, factory calendars are used to initiate month-end processing. Finance InfoCube refreshes, for example, are scheduled to start on workday -6 and continue to workday +10.
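The workday arithmetic behind this kind of factory-calendar scheduling can be sketched as follows. This is an illustrative Python model, not BW's implementation; the holiday set is hypothetical, and SAP's exact counting convention for the Workday field may differ slightly.

```python
from datetime import date, timedelta

# Hypothetical factory-calendar holidays
HOLIDAYS = {date(2024, 12, 25), date(2025, 1, 1)}

def is_workday(d):
    """A workday is a weekday that is not a factory-calendar holiday."""
    return d.weekday() < 5 and d not in HOLIDAYS

def workday_before_month_end(year, month, n):
    """Return the date n workdays before the last workday of the month,
    mirroring a 'workday minus n relative to month end' setting."""
    # Last calendar day of the month, then step back to the last workday.
    d = date(year + (month == 12), month % 12 + 1, 1) - timedelta(days=1)
    while not is_workday(d):
        d -= timedelta(days=1)
    # Count back n further workdays, skipping weekends and holidays.
    for _ in range(n):
        d -= timedelta(days=1)
        while not is_workday(d):
            d -= timedelta(days=1)
    return d

# Workday -6 relative to the end of December 2024 (25 Dec is a holiday)
print(workday_before_month_end(2024, 12, 6))  # 2024-12-20
```

Note how the holiday on 25 December pushes the result one day earlier than a pure weekday count would give; that is exactly what the factory calendar buys you over a simple date-offset schedule.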
Events and Event Triggers
Events and event triggers are some of the most underrated pieces of functionality within the BW environment. An event is a system command that can be used to start a process such as a batch job; form a part of an event processing chain; or, as you will see later, flag the end of a process by employing the subsequent processing functionality within an InfoPackage. Events bridge different parts of a single process or different processing streams. They are a vital part of automated load processes, especially when used in concert with an event processing chain.
At our firm, we have taken event processing functionality one step further by building a custom program to trigger an event from within a batch job. Event triggers are another standard feature that I will describe in more detail later.
Use transaction SM62 to define a system event: select the Maintain setting (Figure 4) for the User event names option and press Enter. Clicking on the create icon (the blank-sheet icon in Figure 5) allows you to define and name a user event.

Figure 4
The Display/Edit Events screen allows you to select the Maintain option for user events

Figure 5
The Edit User Events screen allows you to create and maintain user events
To trigger the event, go to transaction SM64. In the Name field (Figure 6), enter the event and click on the Trigger button. Note that the Parameter field shown in Figure 6 allows further qualification of an event, and only requires an entry if the functionality is being used.

Figure 6
The Trigger Event in Background Processing screen allows you to trigger events
Event Chains
Event chains are powerful tools that allow you to build dependencies within a load process. They ensure that subsequent processes cannot start until all events within an event chain have been successfully completed. Using the SubseqProcessing feature, you have the added flexibility of triggering subsequent events upon the success or failure of the overall event chain.
Event chains are constructed in the Administrator Workbench (AWB), using the menu path Tools>Event Processing Chains. Figure 7 shows the particular components of an event chain.

Figure 7
Event chains constructed in the Administrator Workbench can split a load into a number of parallel processing streams
In this example, a load has been split into a number of parallel processing streams, as shown by the five events contained in the event chain. At this point, two processes still must be completed: B2PBBC110A.0 and B2PBBC110C. The green check mark beside the other three events indicates that the processes they define have finished.
Once all events within an event chain have been triggered, it is possible to initiate a subsequent process. One advantage of an event chain is the ability to initiate a single process, on the condition that a number of prior processes have finished. Note the green check mark on the subsequent processing box in Figure 7, indicating that a subsequent event trigger has been defined.
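The gating behavior of an event chain can be sketched as follows. This is a minimal illustrative model (BW implements this internally); the event names and the subsequent process are hypothetical.

```python
class EventChain:
    """Minimal sketch of an event chain: the subsequent process fires
    only once every expected event has been triggered."""

    def __init__(self, expected_events, subsequent_process):
        self.pending = set(expected_events)
        self.subsequent_process = subsequent_process

    def trigger(self, event):
        """Mark one event complete; fire the subsequent process when none remain."""
        self.pending.discard(event)
        if not self.pending:
            self.subsequent_process()

fired = []
chain = EventChain({"EVT_A", "EVT_B", "EVT_C"}, lambda: fired.append("SUBSEQUENT"))
chain.trigger("EVT_A")
chain.trigger("EVT_B")
assert not fired                 # two of three events done -- gate still closed
chain.trigger("EVT_C")
assert fired == ["SUBSEQUENT"]   # all events in, subsequent process starts
```

The key property is that the order in which the parallel streams finish does not matter; only completeness of the set opens the gate.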
InfoPackage Groups
Another useful piece of standard BW functionality is the InfoPackage group, which allows you to schedule and manage multiple InfoPackages as a single unit. InfoPackage groups have the same scheduling options as a batch job and offer significant flexibility. They can also be used to initiate subsequent processes such as event triggers, or to execute a Business Add-In (BAdI) or function module.
Like event chains, InfoPackage groups are created from within the AWB. Select InfoSources and click on the InfoPackage group icon, which is circled in Figure 8. Next, right-click on the InfoPackage group top node and select Create. To include individual InfoPackages, drag and drop them into the InfoPackage group.

Figure 8
You can drag and drop individual InfoPackages into an InfoPackage group and schedule it for automatic processing
The InfoPackage group functionality includes a Scheduler screen (Figure 9). You can use it to access the same type of subsequent processing available from the Schedule tab for event chains. The InfoPackages tab allows you to specify which request is active within an InfoPackage group, as well as make other settings.

Figure 9
The InfoPackages tab allows you to: (A) Specify if a request is active within an InfoPackage group; (B) Set the order requests are to be processed; (C) Require that requests commence only after the successful completion of the previous request
You Must Be Creative to Automate
Using standard BW functionality will bring you a long way toward building an automated BW load process. To build a completely automated load process, however, you will probably need to go beyond the standard feature set and start thinking creatively to meet the needs of your specific business. Remember, your aim is to develop a system with fully automated loads. Manual intervention should be the exception, not the rule.
Before you actually start building your autoload processes within BW, first model your design in Excel. This ensures that you have a solid understanding of what is required to build the load. By putting it on paper, you can determine the critical components of the load, such as which events must be created, the order in which the batch job programs must run, and which jobs can run in parallel versus serially. Modeling your design also yields a documented load process that makes both the build effort and ongoing support significantly easier.
As you create your model, you’ll begin to understand why I stressed creative thinking. Being creative and looking for solutions that are beyond the ability of the standard BW feature set allow you to cope with the myriad of problems posed by the unique demands of each business, such as:
- Automating a month-end roll process based on a work schedule defined from a factory calendar
- Streamlining a load by parallel processing on both R/3 and BW
- Monitoring loads so that personnel are notified if processing fails
- Finding the best way to load data when you don’t know when the source system files will arrive
These are some of the issues that I faced, and addressing them was — and continues to be — important to fully automate a load process in my BW system. The rest of this article is devoted to answering these types of questions using a combination of standard functionality, custom coding, and a little creativity.
TVARV, OLAP Variables, and InfoPackages
TVARV is a standard system table available in both R/3 and BW. It is designed to hold variables and their assigned values for interrogation. InfoPackage selections can be held centrally in table TVARV, and OLAP variables within the InfoPackages can be used to set them dynamically. When OLAP variables within InfoPackages were introduced in version 2.0B, they effectively eliminated the need to make monthly manual changes to InfoPackage selections.
To take advantage of OLAP variables within InfoPackages:
- Create a suitable variable using transaction RSZV or by following the menu path Business Explorer>Maintain Variables (Figure 10). The variable must be a characteristic value processed by a user exit.
- Add code to the standard user exit for global variables (EXIT_SAPLRRS0_001) in the ZXRSRU01 include. This is where you direct the processing toward TVARV to retrieve values for the OLAP variable (Figure 11).
- Using the Select data tab (Figure 12) of the Maintain InfoPackage interface, select Type 7 (OLAP variable) from the Type column, then select the specific OLAP variable that has been created for the InfoPackage selection. This will be saved when the InfoPackage is saved and scheduled.
- All of the above functionality comes standard within BW. Note that TVARV can store all relevant variables and their associated values, not just the OLAP variables I’ve mentioned in this example.
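The core of the user-exit logic is a simple keyed lookup: given an OLAP variable name, read its selection ranges from TVARV and return them. The Python sketch below models that idea only; the variable names and the sign/option/low/high tuple layout (mirroring an ABAP RANGES entry) are hypothetical.

```python
# Hypothetical stand-in for table TVARV:
# variable name -> list of (sign, option, low, high) selection rows
TVARV = {
    "ZFISCPER_CURR":  [("I", "EQ", "2024001", "")],
    "ZFISCPER_RANGE": [("I", "BT", "2023001", "2023012")],
}

def get_olap_variable_values(variable_name):
    """Model of the user-exit lookup: return the selection ranges stored
    for an OLAP variable, or an empty list if the variable is undefined."""
    return TVARV.get(variable_name, [])
```

Because every InfoPackage selection resolves through this one table, changing a month-end value in TVARV updates every load that references the variable, with no manual edits to individual InfoPackages.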

Figure 10
Create variables to be used within an InfoPackage that can dynamically update InfoPackage selections

Figure 11
Add the code circled in the figure to direct the processing toward TVARV to retrieve values for the OLAP variable

Figure 12
Use the Select data tab on the Maintain InfoPackage screen to select the specific OLAP variable
My company has developed utilities that allow the TVARV variables to be updated monthly so they automatically “roll” to the appropriate selections for month-end reporting. The TVARV update utilities run in accordance with a factory calendar at month end. As a result, the month-end roll process is completely automated.
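The "roll" performed by such a utility amounts to advancing a stored fiscal-period value to the next period. The sketch below assumes a 12-period fiscal year and a YYYYPPP string format; both are illustrative, not the actual format used by the author's utilities.

```python
def roll_fiscal_period(period):
    """Advance a fiscal period string of the form YYYYPPP (e.g. '2024011')
    to the next period, wrapping to period 001 of the next year after 012.
    Assumes a 12-period fiscal year; the format is illustrative."""
    year, per = int(period[:4]), int(period[4:])
    per += 1
    if per > 12:
        year, per = year + 1, 1
    return f"{year}{per:03d}"

print(roll_fiscal_period("2024011"))  # 2024012
print(roll_fiscal_period("2024012"))  # 2025001
```

Run under a factory-calendar schedule at month end, a function like this keeps the TVARV values current with no human touch.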
Load Monitoring
The concepts behind load monitoring are simple and based on standard BW functionality. Some custom programs sit in the middle of the process. For example, code can be added so that load failures initiate a call to the data center to ensure that the environment is available to users across all time zones at all times.
The overall daily financials load process we use at Barclays is broken into three streams:
- Transaction (fact) extracts
- Master data loading and InfoCube processing
- Aggregate filling
A central control table within BW has been defined that allows tolerance times to be established for each of the above streams. Figure 13 displays the tolerance that has been set for the transaction extract stream.

Figure 13
This screen is used to set a time tolerance for a transaction data extract stream
Each of the three streams triggers a job-start and job-finish batch job. Note that the event triggering the start and finish of the transaction data stream also triggers the monitoring batch jobs. An event, then, can be triggered once and still initiate multiple processes, as demonstrated in Figure 14.

Figure 14
A single event can be used to trigger multiple processes
A custom ABAP program contained within the batch job writes a job-start and job-finish message to a log file. A second custom program calculates the difference between the start and finish times. If the result is greater than the tolerance defined in the above table, a callout is sent to a member of the development team and the load is resumed.
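The tolerance check itself is simple arithmetic, sketched below in Python rather than the ABAP actually used; the stream name, tolerance value, and notification callback are all hypothetical.

```python
from datetime import datetime

# Hypothetical control-table entries: stream -> tolerance in minutes
TOLERANCE_MINUTES = {"TRANSACTION_EXTRACT": 90}

def check_tolerance(stream, start, finish, notify):
    """Compare a stream's elapsed run time with its tolerance.
    Call out via `notify` and return False if the tolerance is exceeded."""
    elapsed = (finish - start).total_seconds() / 60
    limit = TOLERANCE_MINUTES[stream]
    if elapsed > limit:
        notify(f"{stream} ran {elapsed:.0f} min, tolerance {limit} min")
        return False
    return True

alerts = []
ok = check_tolerance("TRANSACTION_EXTRACT",
                     datetime(2024, 1, 15, 2, 0),
                     datetime(2024, 1, 15, 4, 0),
                     alerts.append)
# 120 minutes elapsed against a 90-minute tolerance, so a callout is raised
```

Because the start and finish times come from the job log rather than from the load itself, the monitor also catches loads that hang without ever failing outright.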
Text File Cockpit (Non-R/3 Sourced Data)
The BW environment at my firm is the strategic reporting tool for management information. There are also a number of reporting requirements for data that is held outside of R/3. Reporting non-R/3 data from within BW allows the data to be reported against master data structures that are centrally maintained in R/3 and loaded into BW.
A text file monitor was developed at my firm to address the need for BW to cope with more and more non-R/3 sourced data, and to help automate the process. Underpinning the text file monitor are some of the standard features noted earlier, such as events, event chains, and InfoPackage group functionality.
The custom text file monitor process supports data load integration of non-R/3 source data into an already complex daily load process across many InfoCubes. The logic behind the process is simple. Text files are sent via FTP from the relevant source system to a central directory, and each file represents one event in an event chain. The directory receiving the files is polled once every 15 minutes using a custom ABAP program searching for files with a particular mask defined in the text file monitor (Figure 15).

Figure 15
Custom text file handling program developed for non-R/3 text file loads
When files are detected, an event is triggered, which represents one stream in an event chain. Once all the files are received and all events in the event chain have been triggered, the event chain initiates a subsequent event, which starts an InfoPackage based on a text file data source.
I recommend that source systems send files with a date and time stamp included in the file name. This gives source files unique names, so they cannot be overwritten should a load process be switched off for system maintenance, etc. It is not possible, however, to create an InfoPackage from a text data source with a file name that continually changes. The program behind the text file monitor must therefore copy the files, identified by the file location and mask in the control table, as they are received (Figure 16), and then rename them to a common name for use within an InfoPackage.
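One polling pass of such a monitor can be sketched as below. This is an illustrative Python model of the logic, not the firm's ABAP program; the directory layout, file mask, and staging name are hypothetical.

```python
import glob
import os
import shutil

def poll_and_stage(inbox, mask, staging_path, processed_dir):
    """One polling pass: find files matching the mask (e.g. 'SALES_*.txt',
    where the wildcard covers the date/time stamp), copy the oldest match
    to the fixed name the InfoPackage expects, and archive the original.
    Returns the source path handled, or None if nothing has arrived."""
    matches = sorted(glob.glob(os.path.join(inbox, mask)))
    if not matches:
        return None                      # nothing arrived yet; poll again later
    source = matches[0]                  # timestamped names sort chronologically
    shutil.copy(source, staging_path)    # common name used by the InfoPackage
    shutil.move(source, os.path.join(processed_dir, os.path.basename(source)))
    return source
```

Triggering the corresponding event only after this function returns a path gives the event chain an unambiguous "file received" signal, while the archived timestamped copies preserve a full audit trail.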

Additional functionality allows you to establish a load window that restricts the InfoPackage group's load times to those defined in the text file monitor. If text files arrive outside of the defined load windows, the load waits until the next window opens.
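The load-window check reduces to testing whether the current time falls inside any permitted interval. A minimal sketch, with hypothetical window values and restricted to same-day (non-wrapping) windows:

```python
from datetime import time

def in_load_window(now, windows):
    """Return True if `now` falls within any (start, end) load window.
    Windows are same-day intervals; wrap-around windows are out of scope."""
    return any(start <= now <= end for start, end in windows)

# Hypothetical windows: loading permitted 01:00-05:00 and 22:00-23:30
WINDOWS = [(time(1, 0), time(5, 0)), (time(22, 0), time(23, 30))]
print(in_load_window(time(3, 15), WINDOWS))   # True
print(in_load_window(time(9, 0), WINDOWS))    # False
```

A file detected at 09:00 would thus simply sit staged until the 22:00 window opens, keeping non-R/3 loads clear of the scheduled R/3 loads.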
My company has quite a number of data feeds from non-R/3 systems, and this functionality allows us to control the loading of data from these systems. Prior to implementation of this customized file handling process, it was difficult to control the timing of the loads, which resulted in the non-R/3 data loads getting in the way of the scheduled R/3 loads. Now, my firm is able to control loading across both R/3 and non-R/3 data sources.
Overall, thinking outside of the box has allowed us to use standard functionality most effectively and has made what could be a manual daily process run smoothly and hands-free — automatically!
Luke McElwaine
Luke McElwaine is the BW Team Manager at Barclays Capital. He is lead developer at the firm with four years of experience providing BW solutions, and eight years of experience working with SAP across Financials, Controlling, and Special Purpose Ledger. Before joining Barclays Capital in 1997, Luke worked for Andersen Consulting as a member of its SAP Core Group.