
How to Execute Master Data Delta Refreshes part 2 of 2

Troy_Raney
Digital Support

Last Update:

Feb 14, 2024 3:44:39 AM

Updated By:

Troy_Raney

Created date:

Feb 5, 2024 4:35:25 AM


Environment

  • Qlik Gold Client

 

Transcript

Hello and welcome. My name is Steve George and I'm glad that you're watching today. This is the second video in a two-part series about using Gold Client to perform master data delta refreshes. This video utilizes the Scenarios created in the first video, so please watch the two videos in sequence to best understand the content.

The agenda includes a brief overview, then focuses on how to initiate the data copy, including automation steps. Other available resources provided by Qlik are shared at the end.

This slide provides a reminder of the end-to-end process that was shared in the previous video. Where the first video focused on creating the various Scenarios, this video focuses on initiating the data copy process. Let's jump right in!

To reach the Gold Client solution, use t-code /n/HTG/ZGOLD; then select the Data Wave function.
Click the 'Select' checkbox for the respective Data Wave ID that was created previously.
Click the 'Import Options' button located on the toolbar.
In the window that appears, input the RFC assigned to the target environment.
Next, choose whether to enable the 'Parallel Processing On Import' option.
Then choose whether to replace or skip conflicting data.
Keep in mind that if your organization is targeting recently changed data, this option should be set to "Replace Conflicting Data". If the "Skip Conflicting Data" option is used, data that already exists in the target environment is unlikely to be refreshed. Consider discussing this important setting with your Applications team so that everyone is clear on the expected outcome.
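The difference between the two conflict options can be pictured as a simple merge rule. The following is a minimal Python sketch of that semantics only, not Gold Client's actual implementation; the function name `import_records` and the record structure are hypothetical:

```python
def import_records(target, incoming, mode="replace"):
    """Merge incoming records into the target, keyed by record ID.

    mode="replace": conflicting keys are overwritten with the new data.
    mode="skip":    conflicting keys keep their existing (possibly stale) data.
    """
    for key, record in incoming.items():
        if key in target and mode == "skip":
            continue  # existing record is left untouched, so recent changes are missed
        target[key] = record
    return target

target = {"MAT-1": {"price": 10}}                            # already in the target system
incoming = {"MAT-1": {"price": 12}, "MAT-2": {"price": 7}}   # freshly exported delta

print(import_records(dict(target), incoming, mode="replace"))
# {'MAT-1': {'price': 12}, 'MAT-2': {'price': 7}}
print(import_records(dict(target), incoming, mode="skip"))
# {'MAT-1': {'price': 10}, 'MAT-2': {'price': 7}}
```

With "skip", MAT-1 keeps its old price even though a newer version was exported, which is why "Replace Conflicting Data" is the right choice when refreshing recently changed data.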
Click the 'Accept' button when done. Notice the Data Wave ID now shows the RFC that will be used by the import process.
Click the 'Schedule Jobs' button located on the toolbar.
In the window that appears, do the following:
First, input a job name.
Then choose the frequency of execution; be sure to select Daily, Weekly or Monthly if you want the data copy to occur periodically.
Next, choose whether to use the 'Parallel Processing' option; the larger the data set being copied, the more strongly this option is recommended.
Input the Start date and time for the initial execution.
Next, check the 'Run Date Utility' checkbox; this ensures that the date ranges defined within the respective Scenarios are moved forward each time the copy process is executed.
If this option is not selected, the date ranges used in the various Scenarios remain static, and the periodic process will copy the same data over and over.
Then select the option that designates the number of days this utility looks back in time. Qlik advises using the setting that best aligns with the export's frequency; for example, if the export is scheduled to run weekly, set this utility to weekly as well.
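The effect of the Run Date Utility can be sketched as a change-date window that rolls forward with each scheduled run. This is a conceptual Python illustration under assumed names (`roll_date_range` is hypothetical, not a Gold Client function):

```python
from datetime import date, timedelta

def roll_date_range(run_date, look_back_days):
    """Return the (from, to) change-date window for a given run,
    so each periodic execution covers a fresh slice of data."""
    return (run_date - timedelta(days=look_back_days), run_date)

# Weekly export with a 7-day look-back: the window moves with each run
# instead of staying static and re-copying the same data.
first_run = roll_date_range(date(2024, 2, 5), 7)
second_run = roll_date_range(date(2024, 2, 12), 7)
print(first_run)   # window covering 2024-01-29 through 2024-02-05
print(second_run)  # window covering 2024-02-05 through 2024-02-12
```

Matching the look-back to the job frequency, as advised above, keeps consecutive windows contiguous so no changes fall in a gap between runs.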
Click the 'Accept' button when done.
Close the informational message that appears.
Open t-code SM37 to display this batch job. Feel free to use the 'Job Monitor' shortcut to open SM37 in a new SAPGUI session.
Confirm that the job's start date, time, and frequency are defined as expected.
You should see that two steps exist; the first step runs the Adjust Date Range Utility so that the dates used in the various Scenarios are updated first, and the second step runs the data export.
Here we can see the result after the job has been executed.
The spool written by the initial job shows that the date ranges were updated accordingly.
And the other spool shows the data which was exported.
Be aware that the job which finishes last writes a message near the end of its log indicating which batch job was created in the target environment to initiate the import process.
If needed, you could then monitor that import job to ensure it completes. Once done, this automated copy process is complete!

A couple of Gold Client user guides on Qlik's Help website provide additional how-to content. These resources are titled: How to Copy Master Data Using a Delta Refresh Process and Data Wave - User Guide.

Also, if the Auto-Import feature is new to your organization, be aware that a small amount of Gold Client configuration is required. Reference the content named Configure Auto Import RFC located within the Configuration and Utilities User Guide.

The various tasks needed for performing automated delta refresh copies are now complete. If you have questions, consider referencing the resources I have shared. If support is required, or if your team would like to deviate from the process shared in this video, then kindly submit a support case and Qlik's Gold Client support staff will respond accordingly. Have a great day!
