Speed up the system refresh
Published by Shortcut IT in Sc4SAP · 27 August 2024
Tags: SAP automation;SAP Systemkopie;SAP system copy;SAP system refresh;R3trans;PCA
Refreshing your quality and test systems regularly is important for their quality and usability. But it can turn into a big effort, and on top of that there is time pressure: after all, the systems should be available again quickly. With "Shortcut for SAP systems" and the information from the "PCA" tool, the pre- and post-processing can be automated. Read here how you can speed up the data processing during pre- and post-processing significantly.
This article is a continuation of this one. So if doing a system refresh with "Shortcut for SAP systems" is new to you, you should read that article first.
This article covers speeding up the pre- and post-processing, which can significantly reduce the runtime of the system refresh!
There are two starting points:
1.: The components overlap in the tables they contain. Look, for example, at components SWU3 and SWU3_DATA: you will find many tables that are part of both. Or at components BSSUSER and USER: all tables of BSSUSER are also part of component USER. Consequently, many tables would be exported / imported twice! To optimize the runtime, a single export/import of these tables is preferable.
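The deduplication idea can be sketched in a few lines of Python. This is only an illustration of the logic, not the product's implementation; the component and table names are placeholders:

```python
# Assign each table to exactly one component so it is exported/imported
# only once, even if the PCA tool lists it under several components.
def dedupe_tables(components):
    """components: dict mapping component name -> list of table names."""
    seen = set()
    deduped = {}
    for comp, tables in components.items():
        # Keep only tables not already claimed by an earlier component.
        unique = [t for t in tables if t not in seen]
        seen.update(unique)
        deduped[comp] = unique
    return deduped

# Illustrative example: all tables of BSSUSER are also part of USER,
# so after deduplication USER keeps only the tables BSSUSER lacks.
pca = {"BSSUSER": ["USR01", "USR02"],
       "USER":    ["USR01", "USR02", "USR04"]}
print(dedupe_tables(pca))
# → {'BSSUSER': ['USR01', 'USR02'], 'USER': ['USR04']}
```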
2.: Use parallelization! With a single XML file for the export and import tasks, the tasks in the XML file are processed sequentially - the next task starts only after the previous one has finished.
But usually plenty of resources are available in the SAP system during the export or import: no users or batch jobs are active in these phases. So leverage the system by using several work processes instead of a single one.
From the PCA tool we get the list of all components and their tables, and the program supplied with our product also provides the table sizes. Let's use this information to divide export and import into pieces - balanced by size - and start them in parallel.
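A common way to balance such tasks is a greedy partition by size: sort the components largest first and always place the next one into the currently smallest bin. A minimal sketch, with made-up sizes rather than real component data:

```python
import heapq

def balance(components, n_tasks):
    """Distribute (name, size) pairs over n_tasks bins, largest first,
    always adding the next component to the bin with the smallest total."""
    bins = [(0, i, []) for i in range(n_tasks)]  # (total_size, index, members)
    heapq.heapify(bins)
    for name, size in sorted(components, key=lambda c: -c[1]):
        total, i, members = heapq.heappop(bins)  # smallest bin so far
        members.append(name)
        heapq.heappush(bins, (total + size, i, members))
    return [members for _, _, members in sorted(bins, key=lambda b: b[1])]

# Hypothetical component sizes in MB:
comps = [("USER", 500), ("SWU3", 300), ("RFC", 250),
         ("PRINTER", 120), ("JOBS", 80)]
print(balance(comps, 3))
# → [['USER'], ['SWU3', 'JOBS'], ['RFC', 'PRINTER']]
```

The three bins end up at 500, 380 and 370 MB - close enough that the three parallel tasks should finish at roughly the same time.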
R3trans also offers parallelism for the import - big files are split into portions and processed by child processes of R3trans. This can reduce the runtime of an import with R3trans significantly.
With the recent version of "Shortcut for SAP systems" and the program supplied with our product (ShowPCATables.txt in the ABAP folder, or downloadable here), realizing the parallelism - with multiple work processes as well as R3trans parallelism - becomes quite easy! Let's do a walkthrough.
We start the program and select the "Refresh" option. Note that to parallelize the tasks with good balancing - based on the size of the tables / components - the program must be run in the target system of the refresh.
By default the PCA tool lists components NRIV and SNRO; both contain table NRIV (number ranges) with its full content. As we are going to do a system refresh from our productive system including all business documents, we also take the number ranges from the productive system and leave this table out here (1).
The selection screen accepts some information that is written to the XML files. We create XML files for a system refresh (2) and specify the connection (3) we want to use. It is convenient to have all data and log files in a dedicated directory (4), which we also specify here.
And now it's time to optimize performance:
- When the flag "Avoid multiple export/import of the same table" (5) is activated, a table is exported / imported only once, even though it might be part of multiple components. This covers the first of the starting points mentioned above.
This requires that the same components are processed for export and import! If a table t is part of components compA and compB, table t will not be exported for one of them, so the data file of that component will be incomplete. If both components are imported later on, this does not matter, as table t is imported with one of them!
- The second starting point mentioned above is realized by specifying the number of parallel tasks (6). On my small test system I set the value to 3, meaning that for export as well as for import the tasks are distributed to 3 files, balanced as well as possible based on the data size of the components. These 3 XML files can be processed in parallel. Each execution occupies a dialog work process.
Feel free to leverage the system with more than 3 tasks. However, you should not use all available DIA work processes of your system; a few should be reserved so you can still have a look into the system. - If there are components with a huge amount of data, the balancing of the data volume (and thus the runtime) might become less optimal. Think of a single component with 100 GB of data, whereas all other components together have only 20 GB. The export and import of this single component would dominate the runtime: when the job has finished for all other components, this single component would still be in process.
With the flag "Split big components..." (7), such a big component can be split into several parts, resulting in better balancing of the parallel tasks. - And finally we can use the parallelism option of R3trans (8).
The best number of parallel R3trans child processes depends on CPUs, database capacity, I/O, location of the database server, etc.
You can find more details in SAP Note 1127194, which gives a rule of thumb: setting the parallel value in the range of 1 to 2 times the number of CPUs yields the highest performance gains.
After clicking on "Execute" we get the list of the tables according to the PCA tool, along with size information. Feel free to set filters to exclude components you want to keep from the source system of the system refresh, so that these components are not exported and imported later on. You can use the ALV layouts to store the filter settings, so you do not have to set them every time you use the program.
After clicking on the "XML file" button we will be asked for the directory for the generated XML files.
The program now stores 6 XML files in the directory:
- 3 for the export (as we specified 3 parallel tasks)
- 3 for the import.
The components with their tables were distributed to 3 XML files in consideration of their size, as size is an approximate indicator of the runtime needed. To optimize the runtime, the goal is to start 3 tasks in the system with similar workload. If all 3 tasks need nearly the same runtime - and finish at nearly the same time - the balancing is good.
After the XML file generation the program gives some summarized information about the content.
As we used the option to avoid multiple exports/imports of the same tables, we now have fewer tables than in the ALV list, which has 166 more lines!
The XML files contain the export and import tasks for each component:
Of course, some preparations have to be done before the export. Besides the planning and announcement, the users have to be locked and logged off, batch jobs suspended, etc., before the export starts. This has to be done in advance, and we can do it similarly to the export itself - using the command line tool with an XML file. You will find an example for this in the "XML" folder of our product.
Now that we are using parallel tasks, the prework processing (user locking, suspending batch jobs, etc., plus the 3 export tasks) is no longer a single call of the "Shortcut for SAP systems" command line tool. First the user locking has to be done; after that, the 3 export tasks are started in parallel. This makes it a little more complicated. Starting tasks in parallel on OS level is not a big issue on a Windows computer, but we also want to know when the tasks have finished: thinking of automation, the system copy tasks - including the database copy - are to be started after the export of all data has finished.
I copied the generated export XML files into a directory for the prework and renamed them "Export_task1.xml", "Export_task2.xml" and "Export_task3.xml", which makes my approach to executing the tasks in parallel a bit more generic. Have a look at the command file "main.cmd", which executes the prework of the system refresh and uses a Powershell script that controls the start and end of the parallel tasks:
First the command line tool is called with the XML file containing user locking, suspending batch jobs etc.
After this has been done, a Powershell script "StartTasksAndWait.ps1" is executed. You can find this script as part of all files used in this article here, or supplied with our product (folder xml, file "Example_Systemrefresh_runtime_optimized.zip").
It works as follows:
- it reads a file "ParallelTasks.txt" that contains the calls of the command line tool with the XML files for the export
- for each line an asynchronous task is started
- it periodically checks (default: every 5 seconds) whether the tasks are still active
- after all tasks have been finished, the Powershell script ends.
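The same start-and-wait logic can be sketched in a few lines of Python. This is a cross-platform illustration of the pattern, not the supplied Powershell script; the commands in the task file are whatever calls you put there:

```python
import subprocess
import time

def start_tasks_and_wait(task_file="ParallelTasks.txt", poll_seconds=5):
    """Start one asynchronous process per line of the task file and
    poll periodically until all of them have finished."""
    with open(task_file) as f:
        commands = [line.strip() for line in f if line.strip()]
    # Start all tasks asynchronously (one OS process per command line).
    procs = [subprocess.Popen(cmd, shell=True) for cmd in commands]
    # Check periodically whether any task is still active.
    while any(p.poll() is None for p in procs):
        time.sleep(poll_seconds)
    # All tasks finished; return their exit codes.
    return [p.returncode for p in procs]
```

With this, the calling script only continues once every export task has ended, which is exactly what automation of the subsequent system copy steps needs.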
So in addition to this we have to fill the file "ParallelTasks.txt":
The single command files for the export tasks are just calls of the command line tool with the XML files for the export:
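For illustration, "ParallelTasks.txt" could then simply contain one line per export task, each starting one of the single command files (the file names here are the hypothetical ones matching the renamed XML files above, not names prescribed by the product):

```text
Export_task1.cmd
Export_task2.cmd
Export_task3.cmd
```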
It may seem a bit complicated, but it is not. Once set up, you can use it again and again.
Finally we have a single command file, executable on OS level and automatable with any suitable tool, that does the complete prework - including user locking, suspending batch jobs, etc. - and executes the export in parallel, balanced tasks, saving much runtime!
After starting "QX1_Systemrefresh_Export.cmd" the procedure begins, and if you arrange the 4 console windows that appear, it will look similar to this:
The upper left window (1) shows the console for "main.cmd"; the other windows are the 3 export tasks, started in parallel. In window (1) the tasks (windows (2), (3), (4)) are checked periodically. Once all tasks have finished, the prework of the system refresh is done - we saved much time by not exporting / importing tables redundantly and by using parallel tasks!
Especially for the import phase we will save a lot of time with the combination of parallel tasks (using multiple work processes in the SAP system) and the parallelism option for R3trans. It could not go any faster!
Find all files necessary for the pre- and postprocessing of a system refresh supplied with our product (folder xml, file "Example_Systemrefresh_runtime_optimized.zip"), easily adaptable to your needs. Alternatively you can download the ZIP file here.