harsh2
Partner - Contributor III

Data type issue in Qlik replicateTask

My source is DB2 for IBM i (iSeries AS/400), and my target is Google BigQuery.

I am using the Log Stream method. While replicating from the Log Stream DB2 source to the target, I get the following error and the table becomes queued.

Error:

Handling End of table 'ndp_staging'.'BNFYPF' loading failed by subtask 1 thread 1
Failed to load data from csv file.
Failed to wait for previous run
Command failed to load data with exit error code 1, Command output: 
Upload complete.

Waiting on bqjob_r60fdf4f4d67caf4d_0000018bddb8d4a4_1 ... (0s) Current status: RUNNING
                                                                                      
Waiting on bqjob_r60fdf4f4d67caf4d_0000018bddb8d4a4_1 ... (1s) Current status: RUNNING
                                                                                      
Waiting on bqjob_r60fdf4f4d67caf4d_0000018bddb8d4a4_1 ... (1s) Current status: DONE   
BigQuery error in load operation: Error processing job 'ifl-uat-dataplatform-
prj:bqjob_r60fdf4f4d67caf4d_0000018bddb8d4a4_1': Error while reading data, error
message: CSV processing encountered too many errors, giving up. Rows: 83183;
errors: 1; max bad: 0; error percent: 0
Failure details:
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 1176 byte_offset_to_start_of_line:
273357 column_index: 5 column_name: "CURRFROM" column_type: NUMERIC
value: "attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 1176 byte_offset_to_start_of_line:
273357 column_index: 6 column_name: "CURRTO" column_type: NUMERIC
value: "attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 1176 byte_offset_to_start_of_line:
273357 column_index: 9 column_name: "BNYPC" column_type: NUMERIC
value: "attNULL"
- You are loading data without specifying data format, data will be
treated as CSV format by default. If this is not what you mean,
please specify data format by --source_format.

 

1 Solution

Accepted Solutions
harsh2
Partner - Contributor III
Author

We found that some columns on the source side have values Qlik Replicate sees as null. These columns are set as not null on the target side, causing issues. To fix this, we suggest replacing nulls with blank spaces in Qlik Replicate:

  1. Go to Task Designer mode.
  2. In Global Rules or specific tables, go to Transform.
  3. Add a new transformation using "Replace column value."
  4. Add this to the expression:
    ifnull($AR_M_SOURCE_COLUMN_DATA,' ')

View solution in original post

9 Replies
john_wang
Support

Hello @harsh2 ,

Thanks for reaching out to Qlik Community!

First of all, Log Stream is not a mandatory method for a DB2i to Google BigQuery replication task, although you may certainly use it for special purposes.

As for your error, we can check whether the column values are NULL in the interim CSV file by setting the internal parameters keepCSVFiles and keepErrorFiles to TRUE:

john_wang_0-1700287648919.png

 

Repeat the error, then check the generated CSV files (default location "C:\Program Files\Attunity\Replicate\data\tasks\<taskName>\data_files\"). Let's focus on one column, for example CURRFROM: please confirm whether it is nullable in the target table, and whether its values in the CSV files are NULL (represented by "attNULL").
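If you want to scan the generated files without eyeballing them, a small script can report every cell that carries the "attNULL" null marker shown in the BigQuery error above. This is only a sketch: it assumes nulls appear as the literal string attNULL, and the sample row below is hypothetical, shaped after the CURRFROM/CURRTO/BNYPC errors.

```python
import csv
import io

def find_null_markers(csv_text, marker="attNULL"):
    """Return (line_number, column_index) pairs, both 0-based, for every
    cell equal to the null marker, so they can be matched against the
    column_index values reported in the BigQuery load errors."""
    hits = []
    for line_no, row in enumerate(csv.reader(io.StringIO(csv_text))):
        for col_idx, value in enumerate(row):
            if value == marker:
                hits.append((line_no, col_idx))
    return hits

# Hypothetical sample resembling one line of the interim CSV: columns 5, 6,
# and 9 (CURRFROM, CURRTO, BNYPC in the error above) carry the null marker.
sample = "1,2024-01-01,A,B,C,attNULL,attNULL,X,Y,attNULL\n"
print(find_null_markers(sample))  # [(0, 5), (0, 6), (0, 9)]
```

In practice you would read each file under data_files\ (or error_files\) into `csv_text` and compare the reported indexes against the target table's NOT NULL columns.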

 

Hope this helps.

John.

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!
harsh2
Partner - Contributor III
Author

Hi @john_wang

I tried the internal parameters but still got the same error.

There wasn't any CSV file at "C:\Program Files\Attunity\Replicate\data\tasks\<taskName>\data_files\", but at "C:\Program Files\Attunity\Replicate\data\tasks\<taskName>\error_files\" there was one LOAD00000001.csv file. That CSV file has no column header, so I couldn't tell which column contained nulls; some values were null (like "").
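Even without a header row, the BigQuery error message reports a 0-based column_index (e.g. 5 for CURRFROM), which is enough to pull the offending column out of the headerless file. A minimal sketch, using a hypothetical two-row sample in the shape of LOAD00000001.csv:

```python
import csv
import io

def column_values(csv_text, column_index):
    """Extract one column from a headerless CSV by the 0-based index that
    the BigQuery load error reports; rows too short for the index yield None."""
    return [row[column_index] if column_index < len(row) else None
            for row in csv.reader(io.StringIO(csv_text))]

# Hypothetical sample: column 2 holds the "attNULL" null marker in row 1.
sample = "10,A,attNULL\n20,B,300\n"
print(column_values(sample, 2))  # ['attNULL', '300']
```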

On the target, the CURRFROM column is not nullable.

Thanks & Regards
Harsh Patel

 

john_wang
Support

Hello @harsh2 ,

Thanks for the update.

The two internal parameters are not meant to solve the problem; they are for troubleshooting only. Would you please open a support case and also:

1. Set SOURCE_CAPTURE/TARGET_APPLY to Verbose in the task, repeat the error, and upload the Diag Packages to the ticket.

2. Decrypt the task log files (see the steps), then upload the decrypted task log file.

3. Attach the generated CSV files.

The Support team will help you further with this issue.

Regards,

John.

harsh2
Partner - Contributor III
Author

We found that some columns on the source side have values Qlik Replicate sees as null. These columns are set as not null on the target side, causing issues. To fix this, we suggest replacing nulls with blank spaces in Qlik Replicate:

  1. Go to Task Designer mode.
  2. In Global Rules or specific tables, go to Transform.
  3. Add a new transformation using "Replace column value."
  4. Add this to the expression:
    ifnull($AR_M_SOURCE_COLUMN_DATA,' ')
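The behavior of the expression above can be checked locally: Qlik Replicate transformation expressions use SQLite syntax, so Python's built-in sqlite3 module reproduces what ifnull() does. $AR_M_SOURCE_COLUMN_DATA stands for the source column's value inside Replicate; here a literal NULL and a literal 42 play that role.

```python
import sqlite3

# In-memory SQLite database, just to evaluate the ifnull() expression:
# a NULL source value is replaced by a blank space, while a non-null
# value passes through unchanged.
conn = sqlite3.connect(":memory:")
print(conn.execute("SELECT ifnull(NULL, ' ')").fetchone()[0])  # ' '
print(conn.execute("SELECT ifnull(42, ' ')").fetchone()[0])    # 42
```

Note that this trades the NULL for a single space, which suits character columns; whether a space is acceptable for a NUMERIC target column is worth verifying against your target schema.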
harsh2
Partner - Contributor III
Author

Hi @john_wang

This happens because not-null columns have garbage values on the source side, but now I am facing an error in one table where RRN has nulls.

How should I handle this situation?

I loaded the same table in another task and it loaded successfully, but in my task it is in an error state with the following error:

Handling End of table 'ndp_staging'.'CLDEPF' loading failed by subtask 1 thread 1
Failed to load data from csv file.
Failed to wait for previous run
Command failed to load data with exit error code 1, Command output:
Upload complete.

Waiting on bqjob_r7705add47276152b_0000018c10e2537a_1 ... (0s) Current status: RUNNING

Waiting on bqjob_r7705add47276152b_0000018c10e2537a_1 ... (0s) Current status: DONE
BigQuery error in load operation: Error processing job 'ifl-uat-dataplatform-
prj:bqjob_r7705add47276152b_0000018c10e2537a_1': Error while reading data, error
message: CSV table encountered too many errors, giving up. Rows: 100; errors:
100. Please look into the errors[] collection for more details.
Failure details:
- Error while reading data, error message: CSV processing encountered
too many errors, giving up. Rows: 100; errors: 100; max bad: 0;
error percent: 0
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 1 byte_offset_to_start_of_line: 0
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 2 byte_offset_to_start_of_line: 170
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 3 byte_offset_to_start_of_line: 337
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 4 byte_offset_to_start_of_line: 507
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 5 byte_offset_to_start_of_line: 677
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 6 byte_offset_to_start_of_line: 847
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 7 byte_offset_to_start_of_line: 1017
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 8 byte_offset_to_start_of_line: 1187
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 9 byte_offset_to_start_of_line: 1357
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 10 byte_offset_to_start_of_line: 1527
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 11 byte_offset_to_start_of_line: 1697
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 12 byte_offset_to_start_of_line: 1867
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 13 byte_offset_to_start_of_line: 2037
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 14 byte_offset_to_start_of_line: 2207
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 15 byte_offset_to_start_of_line: 2374
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 16 byte_offset_to_start_of_line: 2544
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 17 byte_offset_to_start_of_line: 2714
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 18 byte_offset_to_start_of_line: 2884
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 19 byte_offset_to_start_of_line: 3054
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"
- Error while reading data, error message: 'attNULL' is null for
required field; line_number: 20 byte_offset_to_start_of_line: 3224
column_index: 8 column_name: "RRN" column_type: INT64 value:
"attNULL"

Thanks & regards,

Harsh Patel

SumitSingh
Partner - Contributor III

Hi @john_wang 

I'm also facing the same issue. Can you please help?

Thanks!

Sumit

 

john_wang
Support

Hello @harsh2 , @SumitSingh ,

Do you mean the RRN values are null? That is strange. The Relative Record Number (RRN) is an internal identifier assigned to each record in a physical file on DB2 for i (AS/400); it cannot be null (or empty, or zero), much like ROWID in the Oracle world.

Would you mind sharing the RRN setting in your DB2 source endpoint? A sample:

john_wang_0-1701182255456.png

thanks,

John.

 

 

harsh2
Partner - Contributor III
Author

Hello @john_wang

We are loading LACTDTA.CLDEPF (the table that got the error in our task). In another task the same table loads successfully, but in this particular task it gives the error.

(By the way, we used the same source and target endpoint connections for the other task.)

RRN setting in the DB2 source endpoint:

harsh2_0-1701183163227.png

Will changing the endpoint settings affect our Replicate task?

Thanks & regards 

Harsh Patel

john_wang
Support

Hello @harsh2 ,

Thanks for the follow up.

Yes, of course, changing the endpoint settings will affect the running Replicate task. However, it's hard to tell why the table works in one task but not in another. Please open a support ticket and attach the Diag Packages of the two tasks for comparison.

Thanks,

John.
