This Techspert Talks session addresses the following:
- How Replicate Works
- Understanding Log Files
- Quickly Resolving Issues
00:00 - Intro
01:02 - What Qlik Replicate does
01:41 - How to create a task
05:02 - Troubleshooting Full Load error
06:06 - Analyzing the task logs
08:54 - Setting the correct schema in task settings
11:49 - Troubleshooting Change Data Capture (CDC)
13:38 - Solution in Mainframe
15:10 - Solution in Replicate
16:34 - Testing Data Capture Changes
18:03 - Q&A: How to do Replication without Primary Keys?
19:05 - Q&A: How to fix missing records?
19:58 - Q&A: What are best practices for Full Load and CDC?
21:03 - Q&A: Where is help for installing Replicate?
21:46 - Q&A: What about communication error?
22:49 - Q&A: Can Replicate have multiple tasks with same Source?
23:31 - Q&A: Where to find driver version info?
24:43 - Q&A: How to know the build number?
25:43 - Q&A: What is the N column in SQL error message?
26:38 - Q&A: Where is Kafka endpoint documentation?
27:28 - Q&A: How to find the table ID referenced in the logs?
28:16 - Q&A: What is restarting with Advanced Run Options?
28:50 - Q&A: What is Resume Processing?
29:12 - Q&A: Can you have multiple tasks pointing to the same target?
29:31 - Q&A: What log files are there?
30:21 - Q&A: Where to turn on Verbose logging?
31:19 - Q&A: How to get error-triggered verbose logging?
32:36 - Troubleshooting Recommendations
Q: Is it possible to convert to consumable data?
A: Further research is needed. Can you open a case and give an example so that we may thoroughly answer your question?
Q: Sometimes log files will be missing. Have you ever had that kind of issue? Did customers ever report that?
A: Yes, there can be multiple reasons for missing database logs. Sometimes a failover occurs and the archive logs are not available on the other node; in DB2 LUW they can become unavailable if the log management facility does not archive and de-archive them in a timely fashion. In some cases, IBM DB2 patches needed to be applied. Definitely search the Community, and if there are no solutions there, open a case and attach a Diagnostic Package. This article should be helpful: What information to provide when troubleshooting missing data?
Q: I don’t see the option to automatically enable data capture change from the Replicate side. Is that option only in db2 source endpoint?
A: Yes, this is for the DB2 z/OS and DB2 LUW endpoints. It is not available for the DB2 iSeries endpoint, nor for other sources such as Oracle or SQL Server.
Q: Hi, is there an option to find out whether Replicate was able to replicate a specific record based on a specific key? Do we need to enable control tables for it?
A: Setting SOURCE_CAPTURE to Verbose may help, and other components may need Verbose logging as well; it really depends on the source type. You can also try setting DATA_RECORD to Verbose. Unfortunately, logging does not let you select by key, but you can turn on Store Changes (in Task Settings) to expose the data records and their keys, and then find the record in the target in a table called "your_schema.table_name__CT".
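As a sketch of what searching a Store Changes table for one key can look like, here is a minimal example using SQLite as a stand-in for the real target database. The table and header column names are illustrative assumptions, not the exact layout — check the change-table schema on your own target:

```python
import sqlite3

# Stand-in for the Replicate target: with Store Changes enabled, captured
# changes land in a "<schema>.<table>__CT" table alongside header columns.
# Names below (ORDERS__ct, header__change_oper, ...) are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ORDERS__ct (
        header__change_oper TEXT,     -- operation type, e.g. I/U/D
        header__timestamp   TEXT,     -- when the change was captured
        order_id            INTEGER,  -- the business key you are searching by
        amount              REAL
    )
""")
conn.executemany(
    "INSERT INTO ORDERS__ct VALUES (?, ?, ?, ?)",
    [("I", "2024-01-01 10:00:00", 1001, 50.0),
     ("U", "2024-01-01 10:05:00", 1001, 75.0),
     ("D", "2024-01-01 10:10:00", 2002, 10.0)],
)

# Pull every change captured for one specific key, in capture order.
rows = conn.execute(
    "SELECT header__change_oper, header__timestamp, amount "
    "FROM ORDERS__ct WHERE order_id = ? ORDER BY header__timestamp",
    (1001,),
).fetchall()
for oper, ts, amount in rows:
    print(oper, ts, amount)
```

The same SELECT-by-key pattern applies on the actual target once you know which columns form the table's key.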
Q: I have a specific problem: when I activate CDC, it shows this error:
"Cannot initialize subtask. Failed while preparing stream component 'st_0_SqlServer2019-111'.
RetCode: SQL_ERROR SqlState: 42000 NativeError: 20028 Message: [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]The Distributor has not been installed correctly. Could not enable database for publishing. Line: 1 Column: -1 Failed (retcode -1) to execute statement: 'exec sp_replicationdboption @dbname = N'Northwind', @optname = N'publish', @value = N'true''"
A: Yes, this is a SQL Server message. I would first check with the SQL Server DBA on what the message means and search the Community for what other customers did to resolve the issue. You can also add TRACE logging to the component shown in the message. Rerun the task, and if it still does not show enough information, open a case and attach the Diagnostic Package.
Q: I had a "snapshot too big" issue on one of my tasks when using Oracle as a source. Is Replicate taking a snapshot of the table to help it keep track of the changes happening on the table during the Full Load?
Q: Now and then I see some records get added to the apply_exceptions table. How do I debug the reason? Is there a specific log level to increase from Info?
A: Yes, you can change TARGET_APPLY to Trace or Verbose as needed, but keep the extra logging on for only a few minutes, then set it back to Info when done. Then examine the contents of attrep_apply_exceptions, as it holds the failed INSERT/UPDATE/DELETE statement and the associated error message. Depending on the error, search the Community and ask your DBA; if you cannot resolve it, open a case and attach the Diagnostic Package.
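As a sketch of what inspecting the exceptions control table can look like, here is a minimal example using SQLite as a stand-in for the real target. The column layout follows the documented control-table shape, but verify it against your own target before relying on it:

```python
import sqlite3

# SQLite stand-in for the target database; on a real target the
# attrep_apply_exceptions control table is created by Replicate itself.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE attrep_apply_exceptions (
        TASK_NAME   TEXT,
        TABLE_OWNER TEXT,
        TABLE_NAME  TEXT,
        ERROR_TIME  TEXT,
        STATEMENT   TEXT,  -- the INSERT/UPDATE/DELETE that failed
        ERROR       TEXT   -- the database error message
    )
""")
conn.execute(
    "INSERT INTO attrep_apply_exceptions VALUES (?, ?, ?, ?, ?, ?)",
    ("my_task", "dbo", "Orders", "2024-01-01 10:00:00",
     "INSERT INTO Orders ...", "Violation of PRIMARY KEY constraint"),
)

# Pull the most recent exceptions for one table to see which statement
# failed and why, before searching the Community or opening a case.
rows = conn.execute(
    "SELECT ERROR_TIME, STATEMENT, ERROR FROM attrep_apply_exceptions "
    "WHERE TABLE_NAME = ? ORDER BY ERROR_TIME DESC LIMIT 10",
    ("Orders",),
).fetchall()
for error_time, statement, error in rows:
    print(error_time, "|", error)
```

The ERROR column usually gives you the exact database message to search for.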
Q: Is the troubleshooting process similar in cases where Logstream is used for CDC?
A: Yes, it is exactly the same. Great question! View the logs and search for the problem, add Trace or Verbose logging only for a few minutes while the problem occurs, then set the logging back to Info. Search the Community for the messages. If you cannot find the resolution and it is a SQL error, you can either Google it and discuss it with your DBA or, if needed, open a case and attach the Diagnostic Package.
Q: Which material should we go through to prepare for the Qlik Replicate certification?
A: Review the User Guide extensively and browse the Community for how-tos. There are also some excellent courses on Learning.Qlik.com for ramping up on everything to do with Qlik Replicate.
Q: When using DB2 z/OS as a source and Kafka as a target, we see that empty fields (not NULL fields, but actual fields filled with blanks) at the source are not sent at all in the data message. Is there an option to force Replicate to send empty fields? We would like to be able to differentiate empty fields from NULL fields when reading the Attunity messages in Kafka.
A: Yes. If you have DB2 z/OS going to Kafka, select the row from the table and view it in hex (HEX ON). Do you see X'40' bytes, which are blanks (spaces), in the field? If so, they can come through to the target. We would need your Diagnostic Package and the CREATE TABLE DDL to reproduce it in-house; a transformation may or may not be needed. Please create a new case.
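The blank-versus-NULL distinction can be checked outside the mainframe as well. A minimal sketch, assuming the cp037 EBCDIC code page (your subsystem's CCSID may differ), showing that the X'40' bytes HEX ON displays really are spaces, while NULL is the absence of a value:

```python
# In DB2 z/OS, character data is EBCDIC: a CHAR field "filled with blanks"
# contains X'40' bytes, whereas a NULL field has no value at all.
# cp037 is assumed here as the EBCDIC code page; adjust to your CCSID.
raw = bytes([0x40, 0x40, 0x40, 0x40])   # a 4-byte blank-filled EBCDIC field
decoded = raw.decode("cp037")

print(raw.hex())        # the hex you would see with HEX ON: 40404040
print(repr(decoded))    # four spaces -- an empty-looking but present value

# A NULL column, by contrast, arrives as no value (None in Python) and is
# therefore distinguishable from a blank-padded string in consumer code.
null_value = None
assert decoded.strip() == "" and decoded is not None
assert null_value is None
```

This is the distinction the Kafka consumer would need Replicate to preserve.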
Q: Tasks can be exported as JSON, and then you can compare versions exported on a regular basis.
A: Yes, you can compare the versions of Replicate recorded in the exported task JSON.
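As a sketch of that comparison, here is a minimal example diffing two exported task definitions. The fragments and key names (`_version`, `source`) are illustrative assumptions — a real export is much larger, but the same top-level diff applies:

```python
import json

# Hypothetical fragments of two task exports taken at different times;
# in practice you would json.load() the two exported .json files.
export_v1 = json.loads(
    '{"_version": "2023.5.0.123", "name": "my_task", "source": "SQL1"}')
export_v2 = json.loads(
    '{"_version": "2023.11.0.456", "name": "my_task", "source": "SQL2"}')

# Report every top-level key whose value differs between the two exports.
changed = {
    key: (export_v1.get(key), export_v2.get(key))
    for key in sorted(set(export_v1) | set(export_v2))
    if export_v1.get(key) != export_v2.get(key)
}
for key, (old, new) in changed.items():
    print(f"{key}: {old} -> {new}")
```

Regularly exporting tasks and diffing them this way (or with any text diff tool) makes configuration drift between versions easy to spot.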
Q: Barb, in your demo you mentioned you can re-run the Full Load to grab the 8 records that were missed. If you do that, doesn't it duplicate records in your MS SQL database?
A: Duplicate records can be handled easily in Task Settings, under Error Handling, then Apply Conflicts. Select "Change to task policy", then set "Duplicate key when applying INSERT" to "UPDATE the existing target record" and "No record found for applying an UPDATE" to "INSERT the missing target record". Save the settings and reload the table.