Column Collation Not Replicated with Parallel Load

Anonymous
Not applicable

Hello,

On Microsoft SQL Server, our source database collation is SQL_Latin1_General_CP1_CI_AS, but we have column collations that are SQL_Latin1_General_CP1_CS_AS. If we define the Replicate task without any transformations, it maintains this column collation. However, if we add transformations, the column collation changes to SQL_Latin1_General_CP1_CI_AS.
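To check this behavior, a query along these lines (the table name is only a placeholder) can be run against both the source and the target database and the results compared; any mismatch shows up in the COLLATION_NAME column:

-- Compare column collations; run on both the source and the target database.
-- 'YourTable' is a placeholder for one of the affected tables.
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, COLLATION_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLLATION_NAME IS NOT NULL
  AND TABLE_NAME = 'YourTable';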

I can understand this limitation when the table is manipulated, either through transformations of existing columns or by adding columns that change the table definition. Unfortunately, the issue also occurs when we simply select a table for parallel load. As we have some larger source tables that require the case-sensitive collation, being able to use parallel load would help our full load considerably.
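A possible workaround, assuming the affected columns are not part of an index or constraint, would be to restore the collation on the target after the full load completes. The column definition below is only an example, since ALTER COLUMN requires restating the full data type and nullability:

-- Restore the case-sensitive collation on the target after full load.
-- dbo.YourTable, YourColumn and varchar(100) are placeholders for the real definition.
ALTER TABLE dbo.YourTable
    ALTER COLUMN YourColumn varchar(100)
    COLLATE SQL_Latin1_General_CP1_CS_AS NOT NULL;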

We submitted a support case asking about this issue and were told that this limitation is by design. However, that isn't clearly indicated in the documentation: p. 284 of the Replicate Setup and User Guide (https://help.qlik.com/en-US/replicate/November2020/pdf/Replicate%20Setup%20and%20User%20Guide.pdf) says, "When replicating from one Microsoft SQL Server database to another, column and table collations will be replicated to the target."

Thank you!

4 Comments
Shelley_Brennan
Former Employee

Could you share the support case number with us?  Thank you!

Status changed to: Open - New
Anonymous
Not applicable
Hello,
Thank you for your help with this issue. The case number we submitted is: 02088020
Thank you!
Shelley_Brennan
Former Employee

We are still looking into this on our side and hope to have another update for you soon.  Thanks!

Shelley_Brennan
Former Employee

We have analyzed this further and determined that this is indeed a bug.  We have reopened your support case 02088020 to handle this.  Thanks for your patience!

Status changed to: Closed - Archived