In SAP NetWeaver BI 7.0, you can join multiple authorizations. However, the union is only possible at an aggregated level. Find out how to implement analysis authorizations for unions and see a custom solution that helps overcome this limitation.
Key Concept
An analysis authorization is a new security concept as of SAP NetWeaver BI 7.0. It differs from the conventional authorization object concept and removes several limitations of the old model for reporting and analysis. Enhancements include, for example, the ability to base authorizations on an unlimited number of InfoObjects and to grant access to transaction data by navigation attributes. Also, you can now set individual InfoObjects to "authorization relevant," regardless of the InfoProvider. Furthermore, you can set validity for individual authorizations instead of the entire role. Finally, you can join multiple authorizations, which is a significant improvement over the previous security model.
In SAP NetWeaver BI 7.0, SAP has made several enhancements to the BI security model. The most important one is the ability to form a union of several authorizations at an aggregated level. However, some limitations still exist when you work with standard analysis authorizations.
Let’s say, for example, that authorizations in a CRM pipeline analysis report are based on three objects: industry, service area, and region. The user has access to the telecom industry, plus the enterprise applications service area within the Midwest region. The requirement in this case is to show all opportunities that meet the security criteria, or, in other words, the union of all security objects shown in Figure 1. Prior to analysis authorizations, it was only possible to show the intersection of all the security objects, that is, the data that meets all three requirements (Figure 2).

Figure 1
Union of security objects

Figure 2
Intersection of security objects
With analysis authorizations it is now possible to show the union at an aggregated level. From there, users can then drill down into individual areas and look at the details. In this case it is possible to report on the overall pipeline value for all of telecom opportunities plus enterprise application opportunities in the Midwest.
However, what if the requirement is to show a union of all the transaction level details? This is not possible using security based on analysis authorizations. For example, assume the user is an industry leader for telecom and also happens to be the service area leader for enterprise applications in the Midwest region. When this user executes the dashboard/report he expects to see details for all the opportunities that fall under his area. In such a scenario you need a custom approach because analysis authorizations only allow reporting at an aggregated level without all the transaction details.
In this article we will cover two scenarios. In the first, we show you how to implement a union of transaction-level details using a custom security approach. In the second, we briefly explain how to do a union at an aggregated level using standard analysis authorizations, for requirements where aggregated reporting is sufficient.
Scenario 1
On a recent CRM project we were faced with some very complex security requirements for a CRM leadership dashboard. We had to derive authorizations from a complex security matrix based on eight different attributes. After evaluating standard authorizations, we turned to a custom approach that was simple, efficient in terms of performance, and allowed us to meet 100% of the security requirements.
In a nutshell, the idea is to classify all the records based on the different attributes that need to be restricted. You then use a user exit variable to filter the query on the attribute values that the user has access to. A three-step approach implements this:
Step 1. Enhance the data model to store the classification for all the records and users
Step 2. Create transformations that determine the data classification and user roles at load time
Step 3. Implement a user exit that can be used in a query to filter based on data classification and user roles
Now let’s look at the approach in more detail and implement the above-mentioned scenario for a CRM report. The security in our example restricts on industry, region, and service area.
Step 1. Enhance the data model to store the classification for all the records and users. Start by enhancing the opportunity line item InfoCube in Administrator Workbench (transaction RSA1). For this example we created a new, simplified version of this InfoCube that contains opportunity number, line item number, client number, service area, and transaction key in its dimensions and sales revenue in the fact table.
You may have noticed that we did not include the region and industry in the InfoCube. The two objects in our example are client master data so we make them attributes of the client number and store them in the client number master data table as depicted in Figure 3. We use the values from the master data table when needed in our transformation that we will build in the second step.

Figure 3
InfoCube layout with all the characteristics including the new transaction key and the security string values
The service area is usually specified at the opportunity line item level, and an opportunity can have multiple line items. As a result, an opportunity can also have multiple service areas. Therefore, you must secure the data at the most granular level, which is the opportunity line item (transaction level).
To do so, you need to come up with a new key that is more granular than opportunity number. In this case, it can be a new characteristic that stores the opportunity number concatenated with the line item number. This new characteristic — which we call the transaction key in our example — has a navigation attribute called security string. This string contains all the attribute values in a string format for each opportunity line item, as indicated in Figure 3. Table 1 shows the contents of this InfoCube in a tabular format along with the transaction key and the security string that are determined at load time in the transformation in step 2.
Opportunity | Item | Service area | Client | Region | Industry | Transaction key | Security string | Sales revenue
00100201 | 010 | EA | 100100 | MW | TEL | 00100201010 | ZSAEAZITELZRMW | 1,000,000
00100201 | 020 | IS | 100100 | MW | TEL | 00100201020 | ZSAISZITELZRMW | 2,000,000
00100201 | 030 | | 100100 | MW | TEL | 00100201030 | *ZITELZRMW | 3,000,000
00100202 | 010 | IS | 100200 | SE | | 00100202010 | ZSAIS*ZRSE | 1,000,000
Table 1 | Contents of the InfoCube in Figure 3 shown in a tabular format
The transaction key comprises the opportunity number concatenated with the line item number, providing a characteristic as granular as the data in the InfoCube. If you were to implement security only at the region and industry level and exclude service area, then the opportunity number alone would suffice.
Note how the security string is made up of the attribute identifiers (e.g., ZSA for service area followed by the actual service area value for that record) in Figure 3 and Table 1. If a certain value is missing, then both the identifier and the value are replaced with an asterisk. For example, the third record in the table does not have a service area and as a result the string starts with an asterisk.
Make sure to pick attribute identifiers that do not exist in the code set for industry, service area, and region. You can also consider using a special character instead of Z to ensure uniqueness.
Another point to consider is the length of the codes; it is best if all codes have the same length. For example, if you have two service areas, EA and EAT, then a user with access to the pattern ZSAEA* could unintentionally gain access to both. You can manipulate the codes in this solution to make them all the same length. You don’t necessarily have to change the actual codes in the master data tables; instead, you can convert the original codes into corresponding fixed-length codes in the start routine in step 2. A minimal sketch of such a conversion follows.
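This sketch assumes a start routine in an SAP NetWeaver BI 7.0 transformation; the field name SERV_AREA and the padded code EA# are hypothetical examples, so substitute your own fixed-length scheme:

FIELD-SYMBOLS <src> TYPE _ty_s_sc_1.   " generated source structure type

LOOP AT SOURCE_PACKAGE ASSIGNING <src>.
  CASE <src>-serv_area.                " hypothetical field name
    WHEN 'EA'.
      <src>-serv_area = 'EA#'.         " pad the 2-character code to 3
    WHEN 'EAT'.
      " already 3 characters; leave unchanged
  ENDCASE.
ENDLOOP.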
Next you need to create a DataStore object (DSO) that contains the employee number and the security string as the key. In our scenario the user is the industry leader for telecom and the regional service area leader for enterprise applications in the Midwest. This means we need an intersection of region and service area, joined with all of the telecom industry. Whenever you need an intersection, you include all the values in the same string; whenever you need a union with another value, you add a new string as a separate record. Table 2 shows the DataStore content in our example for employee 999999.
Employee | Security string
999999 | *ZITEL*
999999 | ZSAEA*ZRMW
Table 2 | DataStore content for employee 999999
The first record gives the user access to all of industry telecom and the second string gives him an intersection of the Midwest region and service area enterprise applications. When the user exit runs, it combines the two, returning the union of the two roles.
This particular DataStore loads data from a table where user authorizations are maintained. The information security team usually maintains this table either manually or with an automated process. SAP NetWeaver BI then pulls from this table and forms the security string necessary to filter reports.
Step 2. Create transformations that determine the data classification and user roles at load time. The transformations populate the security strings for opportunity line items and security roles for the users.
Figure 4 shows the entire data flow, which comprises the four transformations that we will build. Transformation 1 loads the detailed data from SAP CRM into the opportunity line item DSO. This is a straightforward transformation with no additional logic; source fields are directly mapped to the fields in the DSO. Transformation 2 populates the master data of the transaction key (namely the security string) from the opportunity line item DSO. Transformation 3 transfers data from the opportunity line item DSO into the InfoCube and also populates the transaction key. Transformation 4 then determines the security string for each user and loads it into a user role DSO. Transformations 2, 3, and 4 require some routines and formulas that we will show you how to implement next.

Figure 4
Data flow for the entire data model
Let’s start with transformation 2, which fetches the opportunity number, line item number, and service area from the opportunity line item DSO. It then combines that information with region and industry from the client master data table to form the security string for each opportunity line item in the InfoCube. The security string is then stored in the master data table for the transaction key. To create a transformation, right-click on the InfoProvider and select Create Transformation.
Figure 5 shows transformation 2, which consists of a start routine, a formula to populate the transaction key, and a routine to populate the security string. The start routine reads the region and industry from the master data table, as shown in Figure 6. You use these values later in the routine for the security string. The formula for the transaction key concatenates the opportunity number and line item number to form the transaction key, as shown in Figure 7. The opportunity number and item number are both set as inputs.

Figure 5
Transformation 2 consists of a start routine, a formula for the transaction key, and a routine for the security string. Both opportunity number and item number are set as inputs for the formula.

Figure 6
The start routine for transformation 2
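Figure 6 contains the actual code. As a reference, here is a minimal sketch of the same idea; it assumes the client number InfoObject is called ZCLIENT with attributes ZREGION and ZINDUSTRY, so that its master data resides in table /BIC/PZCLIENT (all of these names are assumptions). The internal table gt_client is declared in the global part of the routine so that the security string routine can read it later:

TYPES: BEGIN OF ty_client,
         client   TYPE /bic/oizclient,
         region   TYPE /bic/oizregion,
         industry TYPE /bic/oizindustry,
       END OF ty_client.
DATA gt_client TYPE HASHED TABLE OF ty_client WITH UNIQUE KEY client.

* Read region and industry for all clients from the master data table
SELECT /bic/zclient /bic/zregion /bic/zindustry
  FROM /bic/pzclient
  INTO TABLE gt_client
  WHERE objvers = 'A'.        " active master data records only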

Figure 7
The transaction key formula concatenates the opportunity number and line item number to form the transaction key
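If you prefer a routine over the formula, the equivalent is a single statement; SOURCE_FIELDS and RESULT are the standard names in a BI 7.0 characteristic routine, while the component names OPPT_NO and ITEM_NO are assumptions:

CONCATENATE source_fields-oppt_no source_fields-item_no INTO result.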
The routine for the security string combines the service area, industry, and region to form the string. If a value does not exist, the routine puts in an asterisk instead. Refer to Figure 8 for the actual code for this routine. Make sure to use attribute identifiers that are unique and do not exist in the actual code sets.

Figure 8
The routine for the security string for transformation 2
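For reference, here is a minimal sketch of the logic the routine in Figure 8 implements, reusing the gt_client table from the start routine sketch above (field names remain assumptions):

DATA: ls_client TYPE ty_client,
      lv_sa     TYPE string,
      lv_ind    TYPE string,
      lv_reg    TYPE string.

* Look up the client attributes read in the start routine
READ TABLE gt_client INTO ls_client
     WITH TABLE KEY client = source_fields-client.

* Each part is <identifier><value>, or '*' if the value is missing
IF source_fields-serv_area IS INITIAL.
  lv_sa = '*'.
ELSE.
  CONCATENATE 'ZSA' source_fields-serv_area INTO lv_sa.
ENDIF.

IF ls_client-industry IS INITIAL.
  lv_ind = '*'.
ELSE.
  CONCATENATE 'ZI' ls_client-industry INTO lv_ind.
ENDIF.

IF ls_client-region IS INITIAL.
  lv_reg = '*'.
ELSE.
  CONCATENATE 'ZR' ls_client-region INTO lv_reg.
ENDIF.

CONCATENATE lv_sa lv_ind lv_reg INTO result.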
This completes transformation 2. Next we will create transformation 4, shown in Figure 9, which computes a security string for each user ID and places it in a DSO. This process assumes that user roles (the employee number plus the region, industry, and service area the employee has access to) are available in a table in the source system, which in this case is SAP CRM.

Figure 9
Transformation 4
In this transformation, the employee number is directly mapped to the source field. The security string field takes the region, service area, and industry as inputs and uses a routine to determine the security string for each employee number. Refer to Figure 10 for the source code for this routine. The code assumes that any user ID coming in has access to at least one of the objects; users who do not have access to any of the objects are never written to the source table and thus are never extracted.

Figure 10
Source code for the routine in transformation 4
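The logic in Figure 10 mirrors the security string routine from transformation 2, only fed from the role record instead of the transaction data. In this sketch (field names again assumptions), a missing value becomes an asterisk, which acts as a wildcard once the query filters with the contains pattern option; the all-of-telecom role thus yields *ZITEL* and the Midwest/enterprise applications role yields ZSAEA*ZRMW:

IF source_fields-serv_area IS INITIAL.
  lv_sa = '*'.
ELSE.
  CONCATENATE 'ZSA' source_fields-serv_area INTO lv_sa.
ENDIF.
" ... build lv_ind ('ZI' + industry) and lv_reg ('ZR' + region) the same way
CONCATENATE lv_sa lv_ind lv_reg INTO result.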
In step 1 we included the transaction key in the line item InfoCube and enabled the security string as a navigation attribute. Next you create transformation 3. In this transformation all fields are directly mapped. The transaction key uses opportunity number and item number as inputs and concatenates the two in a formula, just as in transformation 2. Refer to Figure 7 for the actual formula.
Step 3. Implement a user exit that can be used in a query to filter based on data classification and user roles. The user exit variable fetches all the relevant security strings for a user from the user role DSO and passes them to the query. First you need to create a new user exit variable in BEx and then implement the user exit code as shown in Figure 11. Make sure to set LS_RANGE-OPT to CP, which stands for “contains pattern.” This allows the asterisks in the returned strings to act as wildcards.

Figure 11
Code for user exit variable
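Figure 11 has the actual code. For reference, here is a hedged sketch of how such a variable exit typically looks in include ZXRSRU01 of customer exit RSR00001 (function module EXIT_SAPLRRS0_001). The variable name ZSEC_STRING, the active table of the user role DSO (/BIC/AZUSRROLE00), and its field names are assumptions; the sketch also assumes the DSO is keyed by the BW logon user ID (sy-uname). Define the variable in BEx as a customer exit variable that is not ready for input, so that it is processed at I_STEP = 2:

DATA: ls_range   TYPE rsr_s_rangesid,
      lt_strings TYPE TABLE OF string,
      lv_string  TYPE string.

CASE i_vnam.
  WHEN 'ZSEC_STRING'.
    IF i_step = 2.                     " after the variable screen
      " Fetch all security string patterns for the current user
      SELECT /bic/zsecstr
        FROM /bic/azusrrole00
        INTO TABLE lt_strings
        WHERE /bic/zemployee = sy-uname.

      " Return one CP line per role; the query then shows the union of
      " all records whose security string matches any of the patterns
      LOOP AT lt_strings INTO lv_string.
        CLEAR ls_range.
        ls_range-sign = 'I'.
        ls_range-opt  = 'CP'.          " contains pattern: '*' is a wildcard
        ls_range-low  = lv_string.
        APPEND ls_range TO e_t_range.
      ENDLOOP.
    ENDIF.
ENDCASE.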
This completes all the modifications needed. Next you can load the data to populate the master data table, InfoCube, and DSO by creating data transfer processes (DTPs) for all the transformations in Administrator Workbench (transaction RSA1) and executing them. When that is done you can develop queries against the new InfoCube with BEx Query Designer. Make sure to include the security string field as a filter and restrict it with the new user exit variable.
You can do so by clicking on the Filter tab. Drag Security String from the list of available characteristics into the Filter section, right-click on Security String, and then choose Restrict. On the next screen pick the user exit variable that you created.
Note that none of the objects (industry, service area, region, transaction key, or security string) is marked as authorization relevant. All security is handled by filtering on the security string in the query, so it is important to ensure that the security string is included in every query and is always restricted with the user exit variable created earlier.
Performance and Maintenance
This method of filtering data based on user roles is efficient because the user exit only has to fetch a few records from the database, depending on how many roles a person is assigned. Because the number of records is small and the employee number is the key of the DSO, database time is minimal. In addition, the security string is placed in the filter area of the BEx query rather than in the free characteristics or rows, which also minimizes the impact on query performance.
Modeling the security string as a navigation attribute, rather than denormalizing it into the InfoCube/DSO, also minimizes maintenance overhead and downtime. If the string changes due to changes in the master data (client master data in our case), a drop and reload of the InfoCube is not necessary; a full load of the master data table suffices.
Scenario 2
In this scenario we will briefly touch on an alternate solution that uses standard analysis authorizations and allows the union of all the authorizations at an aggregated level. This process does not require enhancing the existing InfoProvider to include the transaction key. Instead, you need to mark all three objects (service area, region, and industry) as authorization relevant.
Also you need to replace the user role DSO in Figure 4 with a copy of the 0TCA_DS01 DSO for storing authorization data. This DSO is part of the standard Business Content delivered with SAP NetWeaver BI 7.0. If it is not already active in your system, you can activate it in Administrator Workbench under the Business Content section, as shown in Figure 12.

Figure 12
Activate Business Content DSO for analysis authorizations
When the object is active, you build a generic DataSource based on a table that pulls the user access data from the source system. This process assumes that the necessary user access parameters are available in a table. For our scenario the source table must contain the entries shown in Table 3. The authorization ID (auth ID) is a unique identifier for each role the user has access to, the object name contains the authorization-relevant InfoObject name in the data model, and the object value contains the value to which the user has access. The BI security team maintains this table.
Employee | Auth ID | Object name | Object value
999999 | 100001 | Industry | TEL
999999 | 100001 | Serv_Area | *
999999 | 100001 | Region | *
999999 | 100002 | Industry | *
999999 | 100002 | Serv_Area | EA
999999 | 100002 | Region | MW
999999 | 100003 | Industry | :
999999 | 100003 | Serv_Area | :
999999 | 100003 | Region | :
Table 3 | Input data for analysis authorizations
The table contains nine entries, but as you can tell from the auth ID field there are three distinct authorization roles (100001, 100002, and 100003). The first role specifies that the user has access to the telecom (TEL) industry regardless of service area and region. The second role specifies that the user has access to enterprise application (EA) opportunities in the Midwest across all industries. The third role contains just a colon (:) for all three objects, which grants access to aggregated values only. This third role is what makes the union of the first two roles possible at an aggregated level, in other words, when industry, service area, and region are not in the query drilldown. This is the primary difference from the first solution, which does not have this restriction.
Next we will show you how to create a transformation that links the DataSource to the DSO. The transformation maps each of the fields in the source table to InfoObjects in the DSO: 0TCTUSERNM is mapped to the employee field, 0TCTAUTH to the auth ID, 0TCTIOBJNM to the object name, and 0TCTLOW to the object value field. The remaining fields can be populated with the constant values shown in Table 4. After the transformation is complete, you can create a DTP and execute it to load data into the DSO. After the data load, the DSO content should look like Table 4; the 0TCTHIGH field is blank for all rows.
Auth ID | Valid to | Object name | Sign | Option | Object value | Version | System | Valid from
100001 | 12.31.9999 | Industry | I | EQ | TEL | A | BI | 01.01.1000
100001 | 12.31.9999 | Serv_Area | I | EQ | * | A | BI | 01.01.1000
100001 | 12.31.9999 | Region | I | EQ | * | A | BI | 01.01.1000
100002 | 12.31.9999 | Industry | I | EQ | * | A | BI | 01.01.1000
100002 | 12.31.9999 | Serv_Area | I | EQ | EA | A | BI | 01.01.1000
100002 | 12.31.9999 | Region | I | EQ | MW | A | BI | 01.01.1000
100003 | 12.31.9999 | Industry | I | EQ | : | A | BI | 01.01.1000
100003 | 12.31.9999 | Serv_Area | I | EQ | : | A | BI | 01.01.1000
100003 | 12.31.9999 | Region | I | EQ | : | A | BI | 01.01.1000
Table 4 | Data after it has been moved to the DSO. Each row applies to 0TCTUSERNM user ID 999999.
Now we are ready to generate the roles based on the data in the DSO. You can do so via transaction RSECADMIN or by executing report RSEC_GENERATE_AUTHORIZATIONS.
When the system generates the authorization role, user 999999 can report on sales values for telecom opportunities plus the enterprise applications opportunities in the Midwest region at an aggregated level. If the user wants to drill down into details, he can look at the details for the telecom opportunities or the enterprise applications opportunities in the Midwest, but not both at the same time.
Make sure to populate 0TCTUSERNM with the user ID to enable direct assignment of the authorization to the user’s profile when the authorizations are generated. In our example we assume that the user ID is 999999.
This solution is definitely simpler, but it has the aggregation limitation described above. If your requirements can be met despite that, this solution is preferred over the custom solution in scenario 1.
Marc Bernard
Marc Bernard is a chief architect for SAP NetWeaver, specializing in SAP NetWeaver Business Intelligence, Business Planning, BI Accelerator, and Enterprise Search. In his role as chief architect, he has a strategic focus on future development topics, works in close alignment with the Development Architecture team, and is a trusted advisor for SAP’s customers. Marc has spoken at many SAP-sponsored events and is also a moderator on the SAP Developer Network.
You may contact the author at marc.bernard@sap.com.
Shahid Manzur
Shahid Manzur is the BI architect for the Price Optimization for Banking solution at SAP Labs. Prior to SAP Labs, Shahid worked for Deloitte and TUI in various BI development roles. He has completed many SAP NetWeaver BI implementations and served as technical lead on several of these projects. Shahid has extensive data warehousing experience in areas such as data modeling, ETL, security, reporting, performance tuning, and cross-application integration.
You may contact the author at shahid.manzur@sap.com.
If you have comments about this article or publication, or would like to submit an article idea, please contact the editor.