Rathesh
Contributor

Loading data from DB into CSV with special characters and then loading it into Azure using bulk load

Hello,

I am trying to load a table from an on-prem SQL Server to an Azure SQL Database (single instance) using bulk load. The source table has 20 million records. My job flow is tDBInput --> tMap --> tFileOutputDelimited --> tAzureStoragePut --> tDBRow. I have used the bulk insert script below inside the tDBRow component.

   BULK INSERT dim.table1
   FROM 'bulkexecfiles/table1.txt'
   WITH (
       DATA_SOURCE = 'BlobStorageAccount',
       FIRSTROW = 2,
       FIELDTERMINATOR = ',',
       ROWTERMINATOR = '0x0a',
       KEEPIDENTITY
   );
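
For reference, the DATA_SOURCE above is an external data source pointing at the blob container. It was created along these lines (the storage URL and credential name here are placeholders, not my actual values):

   -- Placeholder storage account, container, and credential names
   CREATE EXTERNAL DATA SOURCE BlobStorageAccount
   WITH (
       TYPE = BLOB_STORAGE,
       LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
       CREDENTIAL = AzureBlobCredential  -- database-scoped credential holding a SAS token
   );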

I am facing the following issues:

  1. The source data has a lot of special characters (including newlines), so when writing to the txt file I am not able to use any of the usual separators.
  2. If I use a text enclosure when writing to the file, the bulk insert fails (see the sketch after this list).
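
From what I have read, BULK INSERT on SQL Server 2017+ / Azure SQL Database has a CSV mode (FORMAT = 'CSV' together with FIELDQUOTE) that is supposed to parse RFC 4180 style files, where quoted fields may contain the separator and embedded newlines. Something like the sketch below is what I would expect to use with a quoted file from tFileOutputDelimited, though I am not sure this is the right direction:

   -- Assumes the file is RFC 4180 CSV: fields enclosed in double quotes,
   -- embedded quotes doubled. FORMAT = 'CSV' requires SQL Server 2017+ or Azure SQL.
   BULK INSERT dim.table1
   FROM 'bulkexecfiles/table1.txt'
   WITH (
       DATA_SOURCE = 'BlobStorageAccount',
       FORMAT = 'CSV',
       FIELDQUOTE = '"',
       FIRSTROW = 2,
       FIELDTERMINATOR = ',',
       ROWTERMINATOR = '0x0a',
       KEEPIDENTITY
   );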

Can anyone suggest how to process this data? Thanks in advance.

0 Replies