AbiJeev
Creator

Snowflake output bulk component file creation

Hi,

I'm using Talend Real-Time Big Data version 7.3.1. We are using the tSnowflakeOutputBulkExec component to load data. We can modify the COPY INTO command, but we cannot define the delimiter for the file that is created internally in the stage. Is there a way to control how that file is created? We are using a custom local folder, where we can see internal-×××××××.gz files, all comma separated. I want to change the delimiter of these internal files at the moment they are created.

3 Replies
Anonymous
Not applicable

Hi

I checked quickly with version 8.0.1: there is a Copy command Options table in the Advanced settings panel where you can modify the delimiter.

[Screenshot: Copy command Options in the component's Advanced settings panel]
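For reference, the options in that panel end up in the COPY INTO statement that Snowflake executes. A minimal sketch of what the resulting statement looks like (the table and stage names here are made up for illustration):

```sql
COPY INTO my_table
FROM @my_stage
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' COMPRESSION = GZIP);
```

Note that this only tells Snowflake how to parse the staged file while loading; it does not change how the file itself is written.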

Regards

Shong

AbiJeev
Creator
Author

Hi @Shicong Hong,

My question is different. The COPY INTO command copies the already-created file into the table. What I'm asking is how to define the delimiter when the file itself is created. tSnowflakeOutputBulkExec first creates the file internally and then copies it into the table. The option you mention changes the delimiter used while loading into the table, which doesn't help, because the file itself is only ever created as comma-separated CSV.

nickname_Fred_FMO
Contributor

Hi from France,

I'm facing the same problem here:

The input file I use is CSV with pipe delimiters ("|").

tSnowflakeOutputBulk prepares the file in the Snowflake stage.

Then tSnowflakeBulkExec loads it into the database; tSnowflakeBulkExec is configured with a pipe delimiter.

 

It seems tSnowflakeOutputBulk writes the file with comma separators, so changing the tSnowflakeBulkExec delimiter to pipe causes problems, as the error message I get shows: 'String '"",C0005661,CA935:00022:940,C,ACHT,FF,FF0000940,"",QS1001,2015-09-25 00:00:00.000+02:00,2015,201508,2' is too long and would be truncated'.

As you can see, the string is comma separated, even though I changed the "," delimiter to "|" in the tSnowflakeBulkExec component.

So my conclusion is that I have to use the comma delimiter in tSnowflakeBulkExec, but then some records are rejected because they contain commas...

 

There should be a custom delimiter option in tSnowflakeOutputBulk that matches the one in tSnowflakeBulkExec.

 

The workaround I found was to write my data with tFileOutputDelimited to a comma-delimited file with fields enclosed in double quotes, and to use the field_optionally_enclosed_by='\"' file format option: " FILE_FORMAT=(type=csv field_delimiter=',' compression=gzip field_optionally_enclosed_by='\"')".
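The idea behind that workaround can be sketched in a few lines of Python with the standard csv module (the row values here are made-up examples, not real data from the job): once every field is enclosed in double quotes, a comma inside a field no longer collides with the comma delimiter, so a quote-aware parser recovers the fields intact.

```python
import csv
import io

# Made-up rows; the second field of the first row contains an embedded
# comma, which is what caused the rejected records with a plain comma
# delimiter and no quoting.
rows = [
    ["C0005661", "ACME, Inc.", "2015-09-25"],
    ["C0005662", "Widgets Ltd", "2015-09-26"],
]

# Write comma-delimited output with every field enclosed in double
# quotes, mirroring the tFileOutputDelimited settings plus the
# field_optionally_enclosed_by='\"' file format option.
buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", quotechar='"', quoting=csv.QUOTE_ALL)
writer.writerows(rows)

# Reading the file back with a quote-aware parser recovers the fields
# intact: the embedded comma is no longer treated as a delimiter.
buf.seek(0)
parsed = list(csv.reader(buf, delimiter=",", quotechar='"'))
assert parsed == rows
```

This is why the comma delimiter becomes safe to use once quoting is enabled on both the writing and the loading side.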

 

It would be simpler to have a custom delimiter option in tSnowflakeOutputBulk.