datnguyen
Contributor II

Qlik Replicate- Big Query bulk insert statement failed due to bad data.

Good morning,

We are using Qlik Replicate to pull data from iSeries to BigQuery. Occasionally a "bulk insert" error occurs and makes tasks fail. After resuming the tasks with "Resume processing..." in the web GUI, they keep running fine until the same error occurs again some days later. That cycle repeats again and again.

The tables in BigQuery are still populated and look good, but the error keeps coming up every day, which raises our concern.

Please advise.

[Details]

In this example, we have the GQSUDC table (same name in iSeries and BigQuery) and GQSUDC__ct (the change-tracking version). Checking the log file in Qlik Replicate, we learned that the error occurred when Qlik inserted data back from the attrep_changes table into GQSUDC__ct. There is an "Invalid timestamp" error (shown in the error message below).

 

Error message from log file:

Failed (retcode -1) to execute statement: 'INSERT INTO `quotedb_raw`.`GQSUDC__ct` (`header__change_seq`,`header__change_oper`,`header__change_mask`,`header__stream_position`,`header__operation`,`header__transaction_id`,`COMPNUM`,`POLNUM`,`SDUSER`,`SDCODE`,`SDSTAT`,`SDACTY`,`SDRRPL`,`SDRTAP`,`SDGRP`,`SDDISPLAY`,`SDVERIFY`,`SDASSN`,`SDTABL`,`SDSDTY`,`TRANSCTION_TYPE_FLAG`,`SDUNIT`,`SDDRVR`,`SDDEEF`,`SDDENT`,`SDRDIS`,`SDDAMT`,`SDPAMT`,`SDPRDS`,`header__timestamp`,`SDSTMP`,`DATA_EXTRACT_DATETM`,`DATA_LOAD_DATETM`,`RRN`) SELECT CAST( `attrep_changes789B3994BFC808E3`.`col1` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col2` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col3` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col4` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col5` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col6` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col7` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col8` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col9` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col10` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col11` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col12` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col13` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col14` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col15` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col16` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col17` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col18` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col19` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col20` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col21` as STRING) , CAST( `attrep_changes789B3994BFC808E3`.`col22` as NUMERIC) , CAST( `attrep_changes789B3994BFC808E3`.`col23` as NUMERIC) , CAST( `attrep_changes789B3994BFC808E3`.`col24` as NUMERIC) , CAST( 
`attrep_changes789B3994BFC808E3`.`col25` as NUMERIC) , CAST( `attrep_changes789B3994BFC808E3`.`col26` as NUMERIC) , CAST( `attrep_changes789B3994BFC808E3`.`col27` as NUMERIC) , CAST( `attrep_changes789B3994BFC808E3`.`col28` as NUMERIC) , CAST( `attrep_changes789B3994BFC808E3`.`col29` as NUMERIC) , CAST( `attrep_changes789B3994BFC808E3`.`col30` as TIMESTAMP) , CAST( `attrep_changes789B3994BFC808E3`.`col31` as TIMESTAMP) , CAST( `attrep_changes789B3994BFC808E3`.`col32` as TIMESTAMP) , CAST( `attrep_changes789B3994BFC808E3`.`col33` as TIMESTAMP) , CAST( `attrep_changes789B3994BFC808E3`.`col34` as INT64) FROM `quotedb_raw`.`attrep_changes789B3994BFC808E3` WHERE `attrep_changes789B3994BFC808E3`.`seq` >= ? and `attrep_changes789B3994BFC808E3`.`seq` <= ?' [1022502] (ar_odbc_stmt.c:2783)
00005539: 2022-06-30T01:28:30 [TARGET_APPLY ]E: RetCode: SQL_ERROR SqlState: 42000 NativeError: 70 Message: [Simba][BigQuery] (70) Invalid query: Invalid timestamp string "0000-22-00 00:00:00.000000" [1022502] (ar_odbc_stmt.c:2789)
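The string BigQuery rejects, "0000-22-00 00:00:00.000000", is malformed in several ways (year 0000, month 22, day 00), so any strict timestamp parser will refuse it. A quick Python sketch (not part of Replicate, just an illustration) shows the same rejection:

```python
from datetime import datetime

def is_valid_timestamp(s: str) -> bool:
    """Return True if s parses as 'YYYY-MM-DD HH:MM:SS.ffffff'."""
    try:
        datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f")
        return True
    except ValueError:
        return False

# The value from the error message: month 22 and day 00 are out of range.
print(is_valid_timestamp("0000-22-00 00:00:00.000000"))  # False
print(is_valid_timestamp("1900-01-01 00:00:00.000000"))  # True
```

This suggests the source rows carry garbage in a timestamp column rather than Replicate mangling valid data.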

 

Thanks,

Dat


6 Replies
Steve_Nguyen
Support
Support

Do you get the error on the same timestamp "0000-22-00 00:00:00.000000" every time?

If so, you may want to apply a transformation:

https://community.qlik.com/t5/Knowledge/Transformation-Date-Time-Invalid-Value-MySQL/ta-p/1826202

Help users find answers! Don't forget to mark a solution that worked for you! If already marked, give it a thumbs up!
datnguyen
Contributor II
Contributor II
Author

Thanks for your reply. We narrowed the issue down to columns with TIMESTAMP type, and we have a local transformation for each TIMESTAMP field in the table settings.

I will try the global transformation approach suggested in the given link.

Thank you.

datnguyen
Contributor II
Contributor II
Author

Updates: 

I tried the solution in Steve's link: https://community.qlik.com/t5/Knowledge/Transformation-Date-Time-Invalid-Value-MySQL/ta-p/1826202 and let the task run for several days. The same error occurred.

Also, my Qlik Replicate version (7.0.0.356) doesn't support replacing a column value in a global transformation, so I did the transform as a table-level transformation in "Table Settings".

Still looking for the solution.

john_wang
Support
Support

Hello @datnguyen ,

This looks to me like a random data error; how about we find out which bad row causes it? I'd like to suggest:

1. Create a new test task that contains only the 'problematic' table (GQSUDC), and set SOURCE_CAPTURE to Verbose.

2. Set the task log file "Automatic Rollover" to limit the log file size to 2 GB.

     Set the task log file "Automatic Cleanup" and make sure the Replicate task server has enough free space to store the large log files. And/or set "Resource Control" --> "Disk Space" to make sure the test task will not impact the PROD system.

3. After the problem happens, save the existing task log files;

     "Resume" the task, and save the new task log files.

      Compare the two sets of log files to find the differing line (by PK value, Change Seq number, or RRN, any of which can identify the unique row). Then compare that row's column values with the DSPJRN result to fully understand why the problem happens.

4. (Optional) Install a higher Replicate version, e.g. 2021.11, on a test server and run the same replication job for comparison.
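The log comparison in step 3 could be done with any diff tool; a minimal Python sketch (sample log lines are hypothetical, not actual Replicate log output) that keeps the lines unique to the failing run:

```python
import difflib

def lines_only_in_first(failed_log: list[str], resumed_log: list[str]) -> list[str]:
    """Return lines present in the failed run's log but not the resumed run's,
    so the bad change record (e.g. by RRN) can be located."""
    diff = difflib.ndiff(failed_log, resumed_log)
    # difflib.ndiff prefixes lines unique to the first sequence with '- '
    return [line[2:] for line in diff if line.startswith("- ")]

# Hypothetical example: the resumed run skipped the record that crashed.
failed = ["apply RRN=100", "apply RRN=101 SDSTMP=0000-22-00 00:00:00", "SQL_ERROR"]
resumed = ["apply RRN=100"]
print(lines_only_in_first(failed, resumed))
```

In practice the real logs are large, so filtering to the suspect table and TARGET_APPLY lines first would keep the diff manageable.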

Regards,

John.

 

datnguyen
Contributor II
Contributor II
Author

Thanks @john_wang for your input.

We've already isolated the table in a separate task and used the VERBOSE log level to debug the error. We were able to narrow down the column causing the issue and tried the table-level transformation below:

CASE $SDSTMP 
	WHEN '0001-01-01 00:00:00' THEN '1900-01-01 00:00:00' 
	WHEN '0000-22-00 00:00:00.000000' THEN '1900-01-01 00:00:00'
	ELSE STRFTIME('%Y-%m-%d %H:%M:%f', $SDSTMP, "UTC") 
END

It has worked well so far. We will keep monitoring for a few more days.
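The CASE expression above can be mirrored in plain Python as a sanity check of the mapping (a sketch only: the sentinel values are taken from this thread, and the ELSE branch's STRFTIME reformat is simplified to a pass-through):

```python
def sanitize_sdstmp(value: str) -> str:
    """Map known bad SDSTMP sentinel values to a safe default timestamp,
    mirroring the Replicate CASE transformation; pass other values through."""
    sentinels = {"0001-01-01 00:00:00", "0000-22-00 00:00:00.000000"}
    if value in sentinels:
        return "1900-01-01 00:00:00"
    return value

print(sanitize_sdstmp("0000-22-00 00:00:00.000000"))  # 1900-01-01 00:00:00
print(sanitize_sdstmp("2022-06-30 01:28:30"))         # 2022-06-30 01:28:30
```

One caveat of this approach: it only handles the sentinel values seen so far, so a new malformed value from the source would still fail the CAST until it is added to the list.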

Dana_Baldwin
Support
Support

Hi @datnguyen 

If one of the replies helped to resolve the issue, please mark it as a solution & like it to make it easier for others to find.
Thanks in advance!

Dana