<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Loading data from DB into CSV with special characters and then load it into Azure using bulk load in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Loading-data-from-DB-into-CSV-with-special-characters-and-then/m-p/2368033#M131345</link>
    <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I am trying to load a table from an on-prem SQL Server to an Azure SQL Server (single instance) database using bulk load. The source table has 20 million records. My job flow is tDBInput --&amp;gt; tMap --&amp;gt; tFileOutputDelimited --&amp;gt; tAzureStoragePut --&amp;gt; tDBRow. I have used a bulk insert script inside the tDBRow component.&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;BULK INSERT dim.table1&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;FROM 'bulkexecfiles/table1.txt'&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;WITH ( DATA_SOURCE = 'BlobStorageAccount',&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;		FIRSTROW = 2,&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;		FIELDTERMINATOR = ',',&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;		ROWTERMINATOR = '0x0a',&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;		KEEPIDENTITY&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;);&lt;/B&gt;&lt;/P&gt;&lt;P&gt;I have the following issues:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;The source data contains many special characters (including newlines), so when writing to the txt file I cannot use any of the usual separators.&lt;/LI&gt;&lt;LI&gt;If I use a text enclosure when writing the file, the bulk insert fails.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Can anyone suggest how to process this data? Thanks in advance.&lt;/P&gt;</description>
    <pubDate>Fri, 15 Nov 2024 23:42:23 GMT</pubDate>
    <dc:creator>Rathesh</dc:creator>
    <dc:date>2024-11-15T23:42:23Z</dc:date>
    <item>
      <title>Loading data from DB into CSV with special characters and then load it into Azure using bulk load</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Loading-data-from-DB-into-CSV-with-special-characters-and-then/m-p/2368033#M131345</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I am trying to load a table from an on-prem SQL Server to an Azure SQL Server (single instance) database using bulk load. The source table has 20 million records. My job flow is tDBInput --&amp;gt; tMap --&amp;gt; tFileOutputDelimited --&amp;gt; tAzureStoragePut --&amp;gt; tDBRow. I have used a bulk insert script inside the tDBRow component.&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;BULK INSERT dim.table1&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;FROM 'bulkexecfiles/table1.txt'&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;WITH ( DATA_SOURCE = 'BlobStorageAccount',&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;		FIRSTROW = 2,&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;		FIELDTERMINATOR = ',',&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;		ROWTERMINATOR = '0x0a',&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;		KEEPIDENTITY&lt;/B&gt;&lt;/P&gt;&lt;P&gt; &amp;nbsp; &amp;nbsp;&lt;B&gt;);&lt;/B&gt;&lt;/P&gt;&lt;P&gt;I have the following issues:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;The source data contains many special characters (including newlines), so when writing to the txt file I cannot use any of the usual separators.&lt;/LI&gt;&lt;LI&gt;If I use a text enclosure when writing the file, the bulk insert fails.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Can anyone suggest how to process this data? Thanks in advance.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Nov 2024 23:42:23 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Loading-data-from-DB-into-CSV-with-special-characters-and-then/m-p/2368033#M131345</guid>
      <dc:creator>Rathesh</dc:creator>
      <dc:date>2024-11-15T23:42:23Z</dc:date>
    </item>
  </channel>
</rss>

