Hi @rhall and other Experts
I have delimited .tsv files generated by one of my jobs. I need a way to read the files one by one and output them into a Redshift DB table.
My initial job looks like this.
Is there a way I can create multiple temp tables, one per iteration, and then later merge all of these temp tables into the main Redshift table?
Thanks
Harshal.
@Parikhharshal, if you are using the Enterprise edition:
1) Take the current file name from tFileList, remove the .csv or .tsv extension, and use that name as the temp table name, with "Action on table" set to Create. The first flow is your design as shown.
2) In the second flow, use tDBList to list the tables, iterate over them, and use tOracleInput and tOracleOutput.
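As an illustration, the filename-to-table-name step in 1) could look like the sketch below. This is my own Python illustration, not Talend code; the helper name and the sanitizing rule are assumptions:

```python
import os
import re

def temp_table_name(file_path):
    """Derive a temp table name from a file path by dropping the
    directory and the .csv/.tsv extension (illustrative sketch)."""
    base = os.path.basename(file_path)       # e.g. "Sales-2020.tsv"
    name, _ext = os.path.splitext(base)      # drop ".tsv" / ".csv"
    # Keep only characters that are safe in a table identifier.
    return re.sub(r"[^A-Za-z0-9_]", "_", name).lower()

print(temp_table_name("/data/out/Sales-2020.tsv"))  # sales_2020
```

In the actual job, the equivalent expression would be built from the tFileList globals (e.g. the current file name variable) inside the component settings.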
@Parikhharshal, what is the DB type in tDBOutput?
@Parikhharshal, OK. For merging, you can use tRedshiftRow to insert from each temp table into the main table with a query.
@Parikhharshal, that will also work with multiple files: in tDBOutput, set "Action on table" to Create, take the file name, remove the .csv extension, and use the result as the table name to create.
Then in tDBRow, use that same name as the temp table and write a query that inserts the data from the temp table into the main table.
Use the type of query below in tDBRow; for the stage table name, you need to take the current file name.
insert into category (select * from category_stage);
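To tie it together, the per-file merge statement can be generated from the derived table name, roughly like this. A minimal sketch in Python: `merge_sql` is a hypothetical helper, and in the real job the resulting statement would be placed in the tDBRow/tRedshiftRow query field (with the stage name coming from the current file, so identifiers should be sanitized first):

```python
def merge_sql(main_table, stage_table):
    """Build the INSERT ... SELECT that moves rows from a
    per-file stage table into the main table (sketch only)."""
    return "insert into {} (select * from {});".format(main_table, stage_table)

# One statement per iterated temp/stage table:
print(merge_sql("category", "category_stage"))
# insert into category (select * from category_stage);
```

After all files are merged, the temp tables can be dropped in a final cleanup step.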