SaranyaK
Contributor III

Task Error post qlik upgrade which caused huge latency

 

A task loading from SQL Server to Snowflake is giving the error below. We started getting the error after resuming the task following an upgrade (6.6 to 7.0). Incremental is set to "prioritize online logs".

 

 

EWBtoDLAll replication task encountered the following error:

Stream component 'st_0_DL-EWB' terminated Stream component failed at subtask 0, component st_0_DL-EWB Error executing command Failed to copy data to net changes table Failed to load data of file F:\Program Files\Attunity\Replicate\data\tasks\EWBtoDLAll\cloud\bulk\CDC00000001.csv to database Failed to load EWB.attrep_changes9D0775B30A050F0E from stage, file name: CDC00000001.csv

RetCode: SQL_ERROR  SqlState: 22000 NativeError: 100016 Message: Field delimiter ',' found while expecting record delimiter '\n'

  File 'e65d4932_f3d6_0149_9ceb_c9ff8d89d4a9/0/CDC00000001.csv.gz', line 467443, character 1585

  Row 454831, column ""attrep_changes9D0775B30A050F0E""["seg2":126]

  If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.


------------------------------------------------------------------

This is an automated message generated by Qlik Replicate server AZUVPATTEDWL001.corp.firstam.com for notification EWB-Errors.

 

 

 


Accepted Solutions
lyka
Support

Hello!

This may be caused by a data issue: Replicate is parsing the data incorrectly, which is why, when it is loaded to the target table, you get a conversion error:

RetCode: SQL_ERROR  SqlState: 22000 NativeError: 100016 Message: Field delimiter ',' found while expecting record delimiter '\n'

Please try this:

On your target endpoint, add the internal parameter $info.query_syntax.csv_delimiter,
hit Enter, and set the value to #$QLIK$#

Instead of a comma, Replicate will then use $QLIK$ as the delimiter. You can use another delimiter, as long as it will never appear in an actual record from the source.
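To illustrate why this helps, here is a minimal sketch (not Replicate internals, just the general CSV failure mode): a source value that contains a literal comma produces an extra field when the file delimiter is also a comma, while a rare token like $QLIK$ does not collide. The sample record is hypothetical.

```python
# Hypothetical source record whose second column contains a comma.
row = ["1001", "Smith, John", "2023-01-05"]

comma_line = ",".join(row)        # comma as CSV delimiter
qlik_line = "$QLIK$".join(row)    # rare token as delimiter

# Splitting with a comma yields one field too many (misaligned load);
# splitting with $QLIK$ recovers the original three columns.
print(len(comma_line.split(",")))       # 4
print(qlik_line.split("$QLIK$") == row) # True
```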

 

Additional info:

In older versions of Replicate, the external file format is not automatically dropped.

When a task starts, Replicate creates an external file format and an external data source. When the task is stopped and resumed due to an error, the underlying file format and data source are not dropped.

To handle this, get the external file format name from the task log and manually drop it, along with the equivalent data source, in the target database.
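As a hedged illustration only: in Snowflake the "data source" here typically corresponds to an external stage, and the cleanup might look like the following. The object names below are placeholders; in practice you would take the exact names reported in the Replicate task log.

```sql
-- Placeholder names: substitute the file format and stage names
-- found in the Replicate task log.
DROP FILE FORMAT IF EXISTS "MYDB"."PUBLIC"."ATTREP_FF_EXAMPLE";
DROP STAGE IF EXISTS "MYDB"."PUBLIC"."ATTREP_IS_EXAMPLE";
```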

Hope this helps!

 

Thanks

Lyka


4 Replies
Heinvandenheuvel
Specialist II

My guess is that this has nothing to do with the upgrade as such.

What is the current status? Re-connecting the target to retry the reload over and over?

Did it work at all afterwards? Was it an immediate failure after the upgrade, or had some updates gone through first?

Maybe the upgrade caused a processing delay which in turn caused a larger CSV file but that shouldn't matter.

To study this, I would examine the CSV file in detail. The error suggests that the CSV interpreter thought all the data was there and a newline was expected, but more data followed. This can (but should not) happen when a 'naked' column delimiter (comma) comes through. Count the commas in the indicated lines!
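Counting delimiters can be automated. A minimal sketch (sample lines are made up; in practice you would read them from the staged CDC file named in the error): lines whose comma count differs from the majority are the suspects.

```python
from collections import Counter

def find_suspect_lines(lines):
    """Return (expected_delims, [(lineno, count), ...]) for odd lines."""
    counts = [line.count(",") for line in lines]
    expected, _ = Counter(counts).most_common(1)[0]
    bad = [(i, c) for i, c in enumerate(counts, start=1) if c != expected]
    return expected, bad

sample = [
    "1,alice,2023-01-05",
    "2,bob,2023-01-06",
    "3,smith, john,2023-01-07",   # 'naked' comma inside a value
]
print(find_suspect_lines(sample))  # (2, [(3, 3)])
```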

If the CSV file is no longer there, you may have to tell Replicate to keep it using the keep_csv_files setting (or whatever that's called for Snowflake).

Can you figure out which target table is affected (probably not) while studying the reptask log?

If so, you may want to exclude it; start by timestamp; stop after catching up; re-add the suspect table(s); resume; and reload those tables. Or just reload all if that's reasonable.

Hein

 


Harish6
Contributor

Hi Lyka, 

I have a similar issue where I am trying to find an internal parameter to fix it.

I am providing the error below; please go through it and suggest a solution.

Handling End of table 'VMACIcubed_DATA'.'icubed_document' loading failed by subtask 1 thread 1
Failed to copy data of file D:\Attunity\Replicate\data\tasks\AIGI_EDR_VMAC_ED-VMAC-REPL_FAS_FL_STest\cloud\7\LOAD00000001.csv to database
Failed to load VMACIcubed_DATA.icubed_document from stage, file name: LOAD00000001.csv
RetCode: SQL_ERROR SqlState: 22000 NativeError: 100072 Message: NULL result in a non-nullable column
File '7/LOAD00000001.csv.gz', line 1833706, character 1
Row 1833689, column ""icubed_document""["email_id":6]
If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.
Failed (retcode -1) to execute statement: 'COPY INTO "VMACIcubed_DATA"."icubed_document"("id", "created", "modified", "description", "filename", "email_id", "type", "is_deleted", "submission_id", "url", "parse_results", "preview_state", "thumbnail_aspect", "thumbnail_filename", "xls_json_data_filename", "document_as_pdf_filename", "originating_s3_bucket", "version", "imageright_status", "imageright_document_id", "viki_id", "viki_imageright_status", "viki_imageright_sync_failure_datetime", "viki_imageright_sync_failure_message", "viki_imageright_sync_failure_retry_count", "archlink_proxy_imageright_status", "archlink_proxy_imageright_action_reason", "___SRC_DB_CMT_TS", "___SRC_DB_CHNG_USR", "___SRC_SYST_NM", "___SRC_SYST_CD", "___REPL_TS_UTC") FROM (select $1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15, $16, $17, $18, $19, $20, $21, $22, $23, $24, $25, $26, $27, $28, $29, $30, $31, $32 FROM '@"AIGI_EDR_REPLICATEDDATA"."PUBLIC"."ATTREP_IS_AIGI_EDR_REPLICATEDDATA_5dead0b9_49a9_1d43_b81a_52e26fe034fe"/7/') files = ('LOAD00000001.csv.gz')'
RetCode: SQL_ERROR SqlState: 22000 NativeError: 100072 Message: NULL result in a non-nullable column
File '7/LOAD00000001.csv.gz', line 1833706, character 1
Row 1833689, column ""icubed_document""["email_id":6]
If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.

Steve_Nguyen
Support

1. Make sure that your Snowflake ODBC driver version is 2.25.xx; we have seen similar errors on older Snowflake ODBC versions.

 

2. For the "NULL result in a non-nullable column" error: do you have parallel segments on the tables? If so, try removing the parallel segment loading and reloading the task.

 

3.

Qlik is interpreting a zero-length string as NULL. Open the table from the bottom right of the Designer screen. Select Transform. On the screen that opens, click the Expression field for the column and enter:

CASE $ColumnName WHEN null Then ' ' Else $ColumnName END

Actually, this might work just as well:

ifnull($ColumnName, ' ')
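A minimal sketch of what that transform effectively does, in Python as a stand-in for the Replicate expression (the ifnull helper and the sample rows here are hypothetical): NULL values in the column are replaced with a single space before load, so Snowflake's NOT NULL constraint no longer rejects the row.

```python
def ifnull(value, fallback=" "):
    """Mimic ifnull($ColumnName, ' '): substitute a space for NULL."""
    return fallback if value is None else value

# Hypothetical (id, email_id) rows; the second has a NULL email_id.
rows = [("doc1", "a@x.com"), ("doc2", None)]
fixed = [(doc_id, ifnull(email)) for doc_id, email in rows]
print(fixed)  # [('doc1', 'a@x.com'), ('doc2', ' ')]
```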

 

 

Help users find answers! Don't forget to mark a solution that worked for you! If already marked, give it a thumbs up!