Speed up the system refresh

Regular system refreshes of your quality/test systems are important for their quality and usability. But a refresh can turn into a big effort, and there is time pressure on top: after all, the systems should be available again quickly. With "Shortcut for SAP systems" and the information from the "PCA" tool, the pre- and post-processing can be automated. Read here how you can significantly speed up processing the data during pre- and post-processing.

This article is a continuation of this one. If doing a system refresh with "Shortcut for SAP systems" is new to you, you should read that article first.

This article focuses on speeding up the pre- and post-processing. With this you can significantly reduce the runtime of the system refresh!

There are two starting points:

1. Eliminate duplicates! The tables of the various components overlap. Have a look, for example, at components SWU3 and SWU3_DATA: you will find lots of tables that are part of both components. Or components BSSUSER and USER: all tables of BSSUSER are also part of component USER. Consequently, lots of tables would be exported/imported twice! To optimize the runtime, a single export/import of each of these tables is preferable.
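To illustrate the idea of the de-duplication, here is a minimal PowerShell sketch; the component-to-table mapping below is a hypothetical example, and the real de-duplication is done by the program supplied with the product:

  # Minimal sketch of the de-duplication: keep each table only in the
  # first component that contains it, so it is exported/imported once.
  # The component/table mapping is a hypothetical example.
  $components = [ordered]@{
      'SWU3'      = @('TAB_A', 'TAB_B', 'TAB_C')
      'SWU3_DATA' = @('TAB_B', 'TAB_C')
      'BSSUSER'   = @('TAB_U1')
      'USER'      = @('TAB_U1', 'TAB_U2', 'TAB_U3')
  }
  $seen = New-Object System.Collections.Generic.HashSet[string]
  $deduplicated = [ordered]@{}
  foreach ($comp in $components.Keys) {
      # HashSet.Add returns $false for tables already assigned to an earlier component.
      $deduplicated[$comp] = @($components[$comp] | Where-Object { $seen.Add($_) })
  }
  $deduplicated   # SWU3_DATA ends up empty; USER keeps only TAB_U2 and TAB_U3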

2. Use parallelization! With a single XML file for the export and import tasks, the tasks in the XML file are processed sequentially - the next task is started only after the previous one has finished.
But usually plenty of resources are available in the SAP system during the export or import: no users or batch jobs are active on the system in these phases. So leverage the system by using several work processes instead of a single one.
From the PCA tool we get the list of all components and their tables, and the program supplied with our product also provides information about the table sizes. Let's use this information to divide export and import into pieces - balanced by the size information - and start them in parallel.
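To illustrate how balanced pieces can be built from the size information, here is a minimal sketch of a greedy distribution (largest component first, always assigned to the least-loaded task). The component names and sizes are hypothetical; the actual distribution is done by the program supplied with the product:

  # Minimal sketch: distribute components to N parallel tasks, balanced by size.
  # Greedy heuristic: sort by size descending, always assign to the task with
  # the smallest total so far. Names and sizes are hypothetical examples.
  $components = @(
      @{ Name = 'USER';   SizeMB = 850 },
      @{ Name = 'SWU3';   SizeMB = 420 },
      @{ Name = 'COMP_A'; SizeMB = 300 },
      @{ Name = 'COMP_B'; SizeMB = 120 },
      @{ Name = 'COMP_C'; SizeMB = 90 }
  )
  $taskCount = 3
  $tasks = 1..$taskCount | ForEach-Object { @{ Items = @(); Total = 0 } }
  foreach ($comp in ($components | Sort-Object { $_.SizeMB } -Descending)) {
      $target = $tasks | Sort-Object { $_.Total } | Select-Object -First 1
      $target.Items += $comp.Name
      $target.Total += $comp.SizeMB
  }
  for ($i = 0; $i -lt $taskCount; $i++) {
      "Task {0}: {1} ({2} MB)" -f ($i + 1), ($tasks[$i].Items -join ', '), $tasks[$i].Total
  }

If all tasks end up with a similar total size, they should also finish at roughly the same time - which is exactly the goal of the balancing.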

With the current version of the program supplied with our product (ShowPCATables.txt in the ABAP folder, or downloadable here) this becomes quite easy! Let's walk through it.

We start the program and select the "Refresh" option. Note that for parallelizing the tasks with good balancing - based on the size of the tables/components - it is necessary to run the program in the target system of the refresh.
Show PCA tables

We get the list of tables according to the PCA tool, together with size information.
PCA tables with size information

We exclude the NRIV and SNRO components from the selection (using the filter function for the "Component" column), because we do not want to export/import the NRIV table. As we are doing a system refresh, this data is to be taken from the source system.
Exclude number range tables (data to be taken from the source system)

Of course, you can reduce the components or tables further. For example, it may make sense to also exclude the BATCH_INPUT component and/or others whose data you do not need to preserve in the target system. Decide on your own, or agree with your colleagues and/or the user department, which components should be kept in the target system of the system refresh.

Now we use the "XML file" button again to generate the XML files and choose the "Sys.refresh-Export" option.
Option(1) for generating XML file(s)

Now we come to the first of the starting points mentioned above to optimize the runtime. The program asks us whether we want to avoid multiple exports of the same tables.

Note that this requires that you later select the same components for the import as you do now for the export! If a table t is part of components compA and compB, table t will not be exported for one of these components, so that component will be incomplete. If you later select both components for the import again, this does not matter, as table t will be imported with one of them!

Option for avoiding multiple export/import of the same table

We click on "Yes".

After that, another popup comes up, dealing with the second starting point mentioned above: parallelization.
Option for using parallelization in export/import

After clicking on "Yes" the program asks us for the number of tasks to be executed in parallel.
Popup asking for the number of tasks

On my small test system I set the value to 3, meaning that 3 tasks for exporting the data will be executed in parallel. Each of them occupies a dialog work process. Feel free to leverage the system with more than 3 tasks. However, you should not use all available DIA work processes of your system; a few work processes should remain reserved so that you can still have a look into the system.

After that we are asked for
- the target directory for the export files
- a temp directory on our computer.
Here I created a directory "systemrefresh" in DIR_HOME, which I want to use for export and import.
Paths for export files and temp directory

A final dialog window comes up, asking for the directory and the file name of the generated XML file(s):
Path and name for XML file(s)

After that, the program generates 3 XML files and shows some summary information about their content.
Popup with information about components / tables in the XML file(s)

As we used the option for avoiding multiple exports/imports of the same tables, we now have the same number of "Total tables" and "Distinct tables".
The export commands for the selected components with their tables were distributed across the 3 XML files, taking their size into consideration - as the size is an approximate indicator for the required runtime. To optimize the runtime of the export, the goal is to start 3 export tasks with a similar workload in the system. If all 3 export tasks need nearly the same runtime - and finish at nearly the same time - the balancing is quite good.

In the specified directory we will now find the 3 files:
Generated XML files

Now we need to specify the connection to be used (<DefaultConnection>) in the 3 XML files to finalize them and make them usable for the export process.
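If you want to script this step as well, it can be done for all 3 files at once. A minimal sketch, assuming the generated files contain a <DefaultConnection> element directly under the root element; the path pattern and the connection name are placeholders to be replaced by your own values:

  # Minimal sketch: set the <DefaultConnection> value in all generated XML files.
  # Path pattern and connection name are placeholders - adjust them to your setup.
  $connection = 'QX1'
  Get-ChildItem 'C:\refresh\QX1_Export*.xml' | ForEach-Object {
      [xml]$doc = Get-Content -Path $_.FullName -Raw
      # Assumption: <DefaultConnection> is a direct child of the root element.
      $doc.DocumentElement.DefaultConnection = $connection
      $doc.Save($_.FullName)
  }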

Of course, some preparations have to be done before the export. Besides planning and announcement, before starting the export the users have to be locked and logged off, batch jobs have to be suspended, etc. This has to be done in advance, and we can do it similarly to the export - using the command line tool with an XML file. You will find an example for this in the "XML" folder of our product.

Now that we are using parallel tasks, starting the prework processing (including user locking, suspending batch jobs etc. and the 3 export tasks) is no longer a single call of the "Shortcut for SAP systems" command line tool. First the user locking has to be done; after that, the 3 export tasks have to be started in parallel. This makes it a little more complicated. Starting tasks in parallel on OS level is not a big issue on a Windows computer, but we also want to know when the tasks have finished. Thinking about automation: after the export of all data has finished, the system copy tasks - including the database copy - are to be started.

Have a look at this command file "main.cmd", which executes the prework of the system refresh and uses a PowerShell script that controls the start and end of the parallel tasks:
Command file for systemrefresh prework

First the command line tool is called with the XML file containing the user locking, batch job suspension etc.
After this has been done, a PowerShell script "StartTasksAndWait.ps1" is executed. You can find this script as part of all files used in this article here, or supplied with our product (folder xml, file "Example_Systemrefresh_runtime_optimized.zip").
It works as follows (a minimal sketch of such a script is shown after the list):
  • it reads a file "ParallelTasks.txt" that contains the calls of the command line tool with the XML files for the export
  • for each line an asynchronous task is started
  • it periodically checks (default: every 5 seconds) whether the tasks are still active
  • after all tasks have finished, the PowerShell script ends.
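The script shipped with the product may differ in details; the following is only a minimal sketch of the behavior described above (file name and 5-second interval as above, everything else is an assumption):

  # Minimal sketch: read ParallelTasks.txt, start each non-empty line as an
  # asynchronous task, then poll until all tasks have finished.
  $lines = Get-Content -Path '.\ParallelTasks.txt' | Where-Object { $_.Trim() }
  # Each line is expected to be a complete command; run it via cmd.exe.
  $processes = foreach ($line in $lines) {
      Start-Process -FilePath 'cmd.exe' -ArgumentList '/c', $line -PassThru
  }
  # Periodically check (every 5 seconds) whether tasks are still active.
  while (@($processes | Where-Object { -not $_.HasExited }).Count -gt 0) {
      Start-Sleep -Seconds 5
  }
  Write-Host 'All parallel tasks have finished.'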
In addition, we have to fill the file "ParallelTasks.txt":
A file with the tasks to be executed in parallel
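For example, the file could look like this - one complete call per line. The names here are placeholders for the single command files described next, which in turn call the command line tool with one export XML each:

  QX1_Export_Task1.cmd
  QX1_Export_Task2.cmd
  QX1_Export_Task3.cmd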

The single command files for the export tasks are just calls of the command line tool with the XML files for the export:
The single command files of the parallel tasks

It may seem a bit complicated, but it is not. Once set up, you can use it again and again.

The result is a single command file, executable on OS level and automatable with any suitable tool, that does the complete prework - including user locking, suspending batch jobs etc. - and executes the export in parallel, balanced tasks, saving a lot of runtime!

After starting "QX1_Systemrefresh_Export.cmd" the procedure begins, and if you arrange the 4 console windows that come up, it will look similar to this:
Running system refresh with parallel tasks

The upper left window (1) shows the console for "main.cmd"; the other windows are the 3 export tasks started in parallel. In window (1) the tasks (windows (2), (3), (4)) are checked periodically. Once all tasks have finished, the prework of the system refresh is done - much faster than executing the exports sequentially, one after the other! And the same method can also be used for the cleanup and the import phase.

You will find all files necessary for the pre- and post-processing of a system refresh supplied with our product (folder xml, file "Example_Systemrefresh_runtime_optimized.zip"), easily adaptable to your needs. Alternatively, you can download the ZIP file here.



