Convergence of the SAP BI client portfolio (reference link)

Details of a webcast by Alexander Peter from SAP about client convergence in reporting (and planning). I found it quite useful, so I thought I would share it via my blog. See the link below for the actual article, written by Tammy Powlas.

Reference: http://scn.sap.com/community/businessobjects-analysis-ms-office/blog/2015/07/16/analysis-office–bi-and-planning-clients-in-microsoft-office-convergence-asug-webcast

What to do when you destroyed the LIS (MCEX) queue structure

This blog post explains how to fix your LIS queues (using V3 updates) after they became corrupt because the MCEX and regular delta queue structures were modified, and you can't perform a full new setup.

Of course, this sounds stupid: destroy your LIS queues?! Every knowledgeable SAP BW consultant knows you should not transport changes to LIS extractors (structure enhancements) from one system to another while data is still in the queue, or while users are working in transactions that feed these queues (e.g. sales order creation transaction VA01 or sales order change transaction VA02 versus the 2LIS_11_* extractors). I knew this too, but if a transport is imported on the SAP ECC side without your knowledge, or without the proper precautions of locking users and emptying the MCEX/delta queues, the issue can occur through no fault of the consultant. LIS application 11 also stores data physically in a file; after the queue structures change, the system can no longer read these files.

So how do you deal with it when you can't do a full new setup? Beware: this can ONLY be done for DSOs where the key figures are set to overwrite! Additive loads to DSOs or infocubes are not suited to this solution.

 

Investigate erroneous records / ST22

First of all, you'll see short dumps in ST22 from users who tried to post documents to the queue at the moment the transport was imported on the system. These documents can no longer be processed into the regular delta queue with the new structure. Documents posted during the import will be stuck and cannot be moved from the MCEX queue to the 2LIS_11_* delta queues by the V3 ABAP (for 2LIS_11* this is RMBWV311). These documents need to be deleted from the queue in SMQ1.

LIS-destroy-ST22

 

LBWE job will show issues processing the MCEX queue

The job log of the LBWE scheduled job for 2LIS_11 will also show you that the job can no longer run. If needed, deschedule this job via LBWE for as long as you need to resolve the issue.

LIS-destroy-SM37

First, determine the start time of the transport import minus one minute and the end time of the import plus one minute (safety intervals). The documents posted to SMQ1 during this window can no longer be processed by the RMBWV311 ABAP.

 

Investigate/delete SMQ1 records

The example below shows the SMQ1 entries for 2LIS_11*. Before you delete the entries from around the import window, retrieve the sales document number from the queue record itself.

LIS-destroy-SMQ1 record

 

Write down the document numbers

Click on one of the records and scroll down to find the sales document number (in this case 6148290; knowing your number ranges helps you recognize what you're looking for). Note these sales document numbers in an Excel sheet or elsewhere, and only then delete the record from the SMQ1 queue.

LIS-destroy-SMQ1-detail record

After working through the list of erroneous records around the import time of the transport, check whether the V3 ABAP runs again and fills the regular delta queue. If so, you're almost done.
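As a side note: the V3 collection run is normally triggered by the LBWE-scheduled job, but for a one-off check it can also be submitted directly. A minimal sketch (any selection parameters are release dependent, so check the report in SE38 first):

```abap
REPORT zrun_v3_11.
" Sketch: trigger the V3 collection run for LIS application 11
" manually. Normally the LBWE job control does this; use only
" while the regular job is descheduled.
SUBMIT rmbwv311 AND RETURN.
```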

 

Execute a new setup, but only for the documents deleted from the MCEX queue

Now we want to load the new/changed documents that we deleted from the SMQ1 MCEX queue into BW as well. You can do this with a regular setup: in this case, delete the application 11 setup data using transaction LBWG and perform a setup per sales document using transaction OLI7BW.

LIS-destroy-OLI7BW

Once you are done, verify that the sales document numbers you noted down from the deleted SMQ1 records correspond to the number of records now in the setup table. You can check this using RSA3 for the header extractor, 2LIS_11_VAHDR.

LIS-destroy-RSA3

 

For the impacted DSOs you can do a full repair load so the history is complete again.
Don't forget to (re)schedule the LBWE V3 job.

Listing all queries in a workbook using a function module (RRMX_WORKBOOK_QUERIES_GET)

In my current environment I'm working with workbooks which contain a lot of queries. When analyzing the workbooks for a redesign project, I wanted to know which queries are included in each workbook. For this purpose you can use the function module RRMX_WORKBOOK_QUERIES_GET in transaction SE37.

RRMX_WORKBOOK_QUERIES_GET - SE37

You need the technical ID of the workbook, which you can find in the properties window when searching for or opening a workbook in the BEx Analyzer.

Let's take a workbook ID: AS9F0Q0JZ8V7DJKOE669JHG5I; from experience I know this is a workbook with a lot of tabs, pivot tables and local Excel logic.
Enter this ID in the I_WORKBOOK field, use object version A, and run (test) the function module.

Output: there are 17 queries included in this workbook:

RRMX_WORKBOOK_QUERIES_GET - Output
If you want to know which queries these are, press the table icon next to the "17 entries" statement; it will give you an overview of the queries and their technical names:

RRMX_WORKBOOK_QUERIES_GET - output details
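As a sketch of a programmatic alternative, the workbook-to-query assignment can also be read from the underlying tables. The table and field names below (RSRWORKBOOK, RSRREPDIR, WORKBOOKID, GENUNIID, COMPID) are assumptions based on the standard BW data dictionary; verify them in SE11 on your release:

```abap
REPORT zwb_query_list.
" Sketch: list the queries of a workbook via the cross-reference
" table RSRWORKBOOK, resolving the technical query name (COMPID)
" through RSRREPDIR. Field names assumed - verify in SE11.
PARAMETERS p_wbid TYPE rsrworkbook-workbookid
  DEFAULT 'AS9F0Q0JZ8V7DJKOE669JHG5I'.

DATA lv_compid TYPE rsrrepdir-compid.

SELECT r~compid INTO lv_compid
  FROM rsrworkbook AS w
  INNER JOIN rsrrepdir AS r ON r~genuniid = w~genuniid
  WHERE w~workbookid = p_wbid.
  WRITE / lv_compid.
ENDSELECT.
```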

By the way, when searching for this function module, a lot of other interesting ones came up. See for yourself; these might come back in a future post 😉

RRMX function module search

Reference: Tables in SAP BW which give information related to Queries, Work Books and Web Templates

Start process chains, regardless of their scheduling settings, using an ABAP program

General

Loading data into SAP BW step by step is normally done via process chains, which are scheduled to run at a certain point in time. Sometimes, however, you want to run such a chain immediately, without changing its scheduling.

For that purpose I've written a program, ZBW_START_CHAIN, that can start an individual process chain regardless of its scheduling settings.

Selection screen

The selection screen is straightforward and simple: select the technical name of one of the process chains in the system and press Execute.
ZBW_START_CHAIN

After pressing F8 (Execute), the system asks with which priority the job must run; normally priority C is fine.

priority selection

Once you have confirmed the priority, the program reports that the process chain has started! Check transaction RSPC, go to the log view of the chain, and see for yourself that data is being loaded. 🙂 Easy!
process chain log id

Code of the program
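A minimal sketch of what such a program can look like, built around the standard API function module RSPC_API_CHAIN_START. This is a sketch, not necessarily the original listing; exception names and optional parameters vary per release, so check the interface in SE37:

```abap
REPORT zbw_start_chain.
" Sketch: start a process chain immediately, regardless of its
" scheduling, via the standard API RSPC_API_CHAIN_START.
PARAMETERS p_chain TYPE rspc_chain OBLIGATORY.

DATA lv_logid TYPE rspc_logid.

CALL FUNCTION 'RSPC_API_CHAIN_START'
  EXPORTING
    i_chain = p_chain
  IMPORTING
    e_logid = lv_logid
  EXCEPTIONS
    failed  = 1
    OTHERS  = 2.

IF sy-subrc = 0.
  WRITE: / 'Process chain started, log id:', lv_logid.
ELSE.
  WRITE: / 'Could not start process chain', p_chain.
ENDIF.
```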

 

Optimize system parameters – Part 3 – SAPI_SYNC parameter in RSADMIN table in SAP BW

SAP_RSADMIN_MAINTAIN - SAPI_SYNC settings

To optimize data loads in SAP BW from a DSO to an infocube, you can set the SAPI_SYNC parameter in table RSADMIN. With this parameter set, datamart loads (e.g. from DSO to infocube) bypass tRFC: the system no longer uses tRFC to send the data within the BW system. This deactivates RFC compression and decompression, and the data is updated faster.

Measured in practice, this makes such loads about 20 to 30% faster. Be aware: it does not optimize any transformations or update rules involved. Additional performance gains can be found by optimizing the ABAP code in those, but that's out of scope for this post 😉

To set this parameter, start program SAP_RSADMIN_MAINTAIN in transaction SE38.

Parameter input settings:

OBJECT = SAPI_SYNC
VALUE = X

After setting this parameter, enter /$tab in the transaction field in the SAP GUI to reset the table buffer, so the new RSADMIN entry takes effect immediately.
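To double-check that the entry landed in the table, a quick verification sketch (table RSADMIN with fields OBJECT and VALUE):

```abap
REPORT zcheck_sapi_sync.
" Sketch: verify the SAPI_SYNC entry in table RSADMIN.
DATA lv_value TYPE rsadmin-value.

SELECT SINGLE value FROM rsadmin INTO lv_value
  WHERE object = 'SAPI_SYNC'.

IF sy-subrc = 0 AND lv_value = 'X'.
  WRITE / 'SAPI_SYNC is active'.
ELSE.
  WRITE / 'SAPI_SYNC is not set'.
ENDIF.
```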

Important note: if the system no longer sends the data via a process called through tRFC, it ignores load balancing settings, such as the use of logon groups. All parallel processes are started on the server on which the data is extracted. In addition, there is no longer an upper limit on the maximum number of parallel processes (depending on the procedure: parameter MAXPROCS in table ROIDOCPRMS when using the tRFC scheduler via SAPI, or the "Max.Conn." parameter in transaction SMQS when using the new SMQS scheduler). Please also read the post about the settings in table ROIDOCPRMS.

Reference: please also read SAP note 595251.
