Convergence of the SAP BI client portfolio (reference link)

Details of a webcast by Alexander Peter of SAP about the client convergence in reporting (and planning). I found it quite useful, so I thought I'd share it via my blog. See the link below for the actual article, written by Tammy Powlas.


What to do when you destroyed the LIS (MCEX) queue structure

This blog post describes how to fix your LIS queues (using V3 updates) after the queues became corrupt while the MCEX and regular delta queue structures were being modified, in a situation where you can't perform a full new setup.

Of course, this sounds stupid: who destroys their LIS queues?! Every knowledgeable SAP BW consultant knows that you should not transport changes to LIS extractors (structure enhancements) from one system to another while data is still in the queue, or while people are still working in transactions that feed these queues (e.g. sales order creation transaction VA01 or sales order change transaction VA02 versus the 2LIS_11_* extractors). I knew this too, but if a transport gets imported on the SAP ECC side without your knowledge, or without the proper precautions of locking users and emptying the MCEX/delta queues, the issue can occur through no fault of the consultant. SAP LIS application 11 also stores queue data physically in files; after the queue structures change, the system can no longer read these files.

So how do you deal with this when you can't do a full new setup? Beware: this approach ONLY works for DSOs whose key figures are set to overwrite! Additive loads into DSOs or InfoCubes are not suited for this solution.


Investigate erroneous records / ST22

First of all, you'll see some dumps in ST22 from users who tried to post documents to the queue at the moment the transport was imported into the system. These documents can no longer be processed into the regular delta queue with the new structure: documents posted during the import are stuck and cannot be moved from the MCEX queue to the 2LIS_11_* delta queues by the V3 ABAP (for 2LIS_11* this is RMBWV311). These documents need to be deleted from the queue in SMQ1.



LBWE job will show issues processing the MCEX queue

The job log of the LBWE scheduled job for 2LIS_11 will also show you that the job can no longer run. If needed, deschedule this job using LBWE for as long as you need to resolve the issue.


So first determine the start of the transport import minus 1 minute and the end of the transport import plus 1 minute (safety intervals). The documents posted in SMQ1 during this window can no longer be processed by the RMBWV311 ABAP.


Investigate/delete SMQ1 records

In the example below you see the SMQ1 entries for 2LIS_11*. Before you delete the entries from around the import, you can still find the sales document number in the monitor record itself.

LIS-destroy-SMQ1 record


Write down the document numbers

When you click on one of the records and scroll down, you will find the sales document number (in this case 6148290; knowing the number ranges is handy to know what you're looking for). Note these sales document numbers in an Excel sheet or elsewhere; only after that should you delete the record from the SMQ1 queue.

LIS-destroy-SMQ1-detail record

After going through the list of erroneous records around the import time of the transport, check whether you can run the V3 ABAP again to fill the regular delta queue. If so, you're almost done…


Execute a new setup, only for the documents deleted from the MCEX queue

Now we want to load the new/changed documents that we deleted from the SMQ1 MCEX queue into BW as well. You can do this using a regular setup: in this case by deleting the setup table data for application 11 using transaction LBWG and then performing a setup per sales document using transaction OLI7BW.


Once you are done, check that the sales document numbers noted down from the deleted SMQ1 records correspond to the records now in the setup table. Verify this using RSA3 for the header extractor, 2LIS_11_VAHDR.



For the impacted DSOs you can now do a full repair load, so the history is complete again.
Don't forget to (re)schedule the LBWE V3 job.

Listing all queries in a workbook using a function module (RRMX_WORKBOOK_QUERIES_GET)

In my current environment I'm working with workbooks that contain a lot of queries. While analyzing the workbooks for a redesign project, I wanted to know which queries are included in each workbook. For this purpose there is a function module you can use: RRMX_WORKBOOK_QUERIES_GET, in transaction SE37.


You need the technical ID of the workbook, which you can find in the properties window when searching for or opening a workbook in the BEx Analyzer.

Let's take workbook ID AS9F0Q0JZ8V7DJKOE669JHG5I; from experience I know this is a workbook with a lot of tabs, pivot tables and local Excel logic.
Enter this ID in the I_WORKBOOK field, use object version A, and run (test) the function module.
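The same call can also be made from your own ABAP report. A minimal sketch; note that apart from I_WORKBOOK (shown in SE37 as described above), the parameter names and the row type of the result table are assumptions on my side, so check the actual interface in SE37 before using this:

```abap
" Hedged sketch: list the queries of a workbook programmatically.
" Assumption: results come back in a TABLES parameter E_T_QUERIES;
" verify the parameter name and row type in SE37.
DATA lt_queries TYPE STANDARD TABLE OF rszcompkey.

CALL FUNCTION 'RRMX_WORKBOOK_QUERIES_GET'
  EXPORTING
    i_workbook  = 'AS9F0Q0JZ8V7DJKOE669JHG5I'  " technical workbook ID
    i_objvers   = 'A'                          " active object version
  TABLES
    e_t_queries = lt_queries.                  " one row per query
```

This is handy if you want to loop over a whole list of workbooks instead of testing them one by one in SE37.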

Output: there are 17 queries included in this workbook:

If you want to know which queries: press the table icon next to the "17 entries" statement; it will give you an overview of the queries and their technical names:


By the way… when searching for this function module, a lot of interesting ones came up. See for yourself… This might come back in a future post 😉

RRMX function module search

Reference: Tables in SAP BW which give information related to Queries, Work Books and Web Templates

Start process chains, regardless of their scheduling settings, using an ABAP program


Loading data in SAP BW is normally done step by step via process chains, which can be scheduled to run at a certain point in time. Sometimes, however, you want to run such a process chain immediately, without changing its scheduling.

For that purpose I've written a program, ZBW_START_CHAIN, that can start individual process chains regardless of their scheduling settings.

Selection screen

The selection screen is straightforward and simple: just enter the technical name of one of the process chains in the system and press Execute.

After pressing F8 (Execute), the system asks with which priority the job must run; normally priority C is fine.

priority selection

When you have confirmed the priority, the program shows a message that the process chain has started! Check transaction RSPC, go to the log of the process chain, and see for yourself that data is (being) loaded. :-) Easy!
process chain log id

Code of the program
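The original code listing is not reproduced here. A minimal sketch of what such a program can look like, built on the standard BW API function module RSPC_API_CHAIN_START (error handling and the priority dialog are omitted, and this is my reconstruction rather than the original listing):

```abap
REPORT zbw_start_chain.

" Start a single process chain immediately, regardless of scheduling.
PARAMETERS p_chain TYPE rspc_chain OBLIGATORY.

DATA l_logid TYPE rspc_logid.

" Standard BW API: triggers the chain and returns the log ID
CALL FUNCTION 'RSPC_API_CHAIN_START'
  EXPORTING
    i_chain = p_chain
  IMPORTING
    e_logid = l_logid.

WRITE: / 'Process chain', p_chain, 'started with log id', l_logid.
```

The returned log ID is the same one you will find back in the RSPC log view of the chain.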


Optimize system parameters – Part 3 – SAPI_SYNC parameter in RSADMIN table in SAP BW


To optimize data loads within SAP BW, e.g. from a DSO to an InfoCube, you can set the SAPI_SYNC parameter in table RSADMIN. With this parameter set, these datamart loads bypass tRFC: the system no longer uses tRFC to send the data within the BW system. This deactivates RFC compression and decompression, so the data is updated faster.

In my measurements, this makes these loads about 20 to 30% faster. Be aware: this does not optimize any transformations or update rules involved. Additional performance gains can be found by optimizing the ABAP code in there, but that's out of scope for this post 😉

To set this parameter, start program SAP_RSADMIN_MAINTAIN in transaction SE38.

Parameter input settings:
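The input screenshot is missing here; based on the parameter name above and SAP note 595251, the entry in SAP_RSADMIN_MAINTAIN would look roughly like this (the value "X" is the usual on-flag convention and is an assumption on my side; check the note for the exact value):

```
Selection screen of SAP_RSADMIN_MAINTAIN (SE38):
  OBJECT = SAPI_SYNC
  VALUE  = X
  Mode   = INSERT   (use UPDATE if the entry already exists)
```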


After setting this parameter, enter /$tab in the transaction field in the SAP GUI to reset the table buffers, so the new RSADMIN entry takes effect.

Important note: when the system no longer sends the data via a process called through tRFC, it ignores load balancing settings such as the use of logon groups. All parallel processes are started on the server on which the data is extracted. In addition, there is no longer an upper limit on the number of parallel processes (normally imposed by parameter MAXPROCS in table ROIDOCPRMS when using the tRFC scheduler via SAPI, or by the "Max.Conn." parameter in transaction SMQS when using the new SMQS scheduler). Please also read the post about the settings in table ROIDOCPRMS.

Reference: please also read SAP note 595251.

Optimize system parameters – Part 2 – RSCUSTV6 data load parameters in SAP BW

To optimize data loads in SAP BW transaction RSCUSTV6 offers an option to configure a number of parameters:

RSCUSTV6 settings

Frequency

With this frequency you specify after how many data IDocs an info IDoc is sent, i.e. how many data IDocs are described by one info IDoc. If the frequency is 1, each data IDoc is followed by an info IDoc. This setting applies to data transfer from non-SAP source systems. Info IDocs contain information about whether the data IDocs were loaded correctly. In the extraction monitor, you can derive from each info IDoc whether the load process was successful; if this is the case for all data IDocs described in an info IDoc, the traffic light in the monitor is green.

Recommendation: The larger the package size of a data IDoc, the lower you should set the frequency. That way you obtain information on the data load status at relatively short intervals during the upload. In general, choose a frequency between 5 and 10, but not greater than 20.

Package size

The package size specifies the number of data records within a package that are delivered when you load data from the source. This setting applies to data transfer from non-SAP source systems. Data packages can be imported into BI in parallel in the background. Thanks to the better utilization of system resources (the upload is split across several work processes), a parallel upload is more efficient in terms of system performance.

Recommendation: Do not spread the data set across too many packages, because this has a negative impact on performance when data is loaded. The number of data packages per load process should not be higher than 100. The basic setting should be between 5,000 and 20,000, depending on how many data records you want to load. If you want to load a large volume of transaction data, change the number of data records per package from the default value of 1,000 to a value between 10,000 (Informix) and 50,000 (Oracle, MS SQL Server).

Partition size

The partition size specifies the number of records after which the system creates a new partition for the PSA table. The default is 1,000,000 records. Only data records from a complete request are stored in one partition, so the specified value is a threshold value. This setting applies to DataSources from all source systems.


Optimize system parameters – Part 1 – ROIDOCPRMS table settings

In the coming posts I will pay some attention to optimizing system parameters in SAP BW and related SAP source systems. This first post is about optimizing load parameters in SAP BW and the connected SAP source systems.


ROIDOCPRMS table settings 2


To optimize data load parameters for SAP BW and SAP source systems, additional parameters are available in table ROIDOCPRMS. These settings improve the throughput of data loads. Be sure to set these parameters in every SAP BW and connected SAP system. The Src. system field must contain the name of the system in which the ROIDOCPRMS table itself resides. In the screenshot above a BW system is taken as an example; I've set these parameters as well for the connected SAP ECC and SAP CRM systems.


Src. system: The technical system name of the system itself.

Max. (kb): Setting the package size to 50,000 kB makes the transferred data blocks bigger, so the data load finishes quicker. The individual records are sent to the Business Information Warehouse in packages of varying sizes. With this parameter you determine the maximum size of such a package, and therefore how much main memory may be used for the creation of the data package. SAP recommends a data package size between 10 and 50 MB.

Frequency: With this frequency you establish after how many data IDocs an info IDoc is sent. A higher value ensures more blocks are sent before synchronization via info IDocs takes place.

Max. proc.: The maximum number of parallel processes for data transfer. The more parallel processes can run, the better the performance, but bear in mind the number of available dialog processes: you don't want to take down your ECC or CRM transactional systems. A value of 4 is sufficient for small to mid-size SAP BW environments.

Target system to run background job: In programs that schedule jobs, the name of the host system is evaluated according to the type of job step being run. For an ABAP job step, the background processing system looks for an SAP instance on the target system; for an external job step, the name is used directly as the target system for running the external program or command. In my own environments this option is not used.

Change settings

Using transaction SM31 you can maintain the entries in this table.
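Putting the recommendations above together, an entry for a BW system could look like this (the field names are as I remember them from the ROIDOCPRMS table definition; double-check them in SE11 before maintaining the table):

```
SLOGSYS   (Src. system)                      = <own logical system name>
MAXSIZE   (Max. (kb))                        = 50000
STATFRQU  (Frequency)                        = 10
MAXPROCS  (Max. proc.)                       = 4
BTCSYSTEM (Target system for background job) = <empty>
```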


Know-How Network: SAP BW Data Load Performance Analysis and Tuning

Performance Tuning For SAP BW

Quick check ABAP code in BW transformations (and update- & transfer rules)

Where are objects used in SAP BW transformations, transfer rules or update rules to read data from?
In this case we have a DSO object, ZPUR_O05, that we want to phase out, and we want to know in which transformations it is read using ABAP code, to determine the impact.

To do a quick check:

  • Open transaction SE16(n) and look up table RSAABAP
  • Fill field RSAABAP-LINE with *ZPUR_O05* (this field/search is case sensitive!)
  • The field RSAABAP-CODEID can be used to look up RSAROUT-CODEID
  • For transformations: if RSAROUT-CODETP for this record is "TF", look in table RSTRAN with the CODEID in field STARTROUTINE, ENDROUTINE or EXPROUTINE; it will give you the transformation ID, source and target name.
  • For update rules, paste the RSAABAP-CODEID into RSUPDROUT; for transfer rules, look it up in RSTSRULES (local conversion routine field). The same table also gives you the transfer and communication structure. For update rules, look up the metadata in RSUPDINFO.

Following this lookup path for the example *ZPUR_O05*, the TRANID field gives the actual transformation in SAP BW where the code resides.
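The first two steps of the list can also be combined into a single query; a minimal ABAP sketch, assuming the field layout described in the list above (verify the join fields in SE11 first):

```abap
" Find all routine IDs whose generated code mentions ZPUR_O05,
" together with the routine type (e.g. 'TF' for transformations).
SELECT a~codeid, r~codetp
  FROM rsaabap AS a
  INNER JOIN rsarout AS r
    ON r~codeid = a~codeid
  WHERE a~line LIKE '%ZPUR_O05%'   " case sensitive, as noted above
  INTO TABLE @DATA(lt_hits).
```

Each CODEID found this way can then be traced further via RSTRAN, RSUPDROUT or RSTSRULES as described in the list.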

