wangbinlxx
Creator

How to use dynamic schema to bulk load into Redshift?

Hi 

I’m developing a generic job to load data from CSV files into Redshift. I’m trying to use Talend's dynamic schema feature (https://www.youtube.com/watch?v=dqja5wZRq0k ).

 

  1. I managed to load CSV data into Talend with the Dynamic type, as long as the CSV file has a header. I can load the data into Oracle with tOracleOutput.
  2. When I change the target database to Redshift, I hit a wall. When I try to use tRedshiftOutputBulkExec, it gives me the warning "This component doesn't support Dynamic Type."
  3. If I run it anyway, I get the following error. Instead of reading the CSV header to get the column names, it takes them from the schema definition, so the load fails:
    "java.sql.SQLException: [Amazon](500310) Invalid operation: column "alldata" of relation "ft_clicks" does not exist;"
  4. If the file is already in S3, is it possible to do this task?

[two screenshot attachments]

 

Thanks,

 

3 Replies
fdenis
Master

Yes, you can load an S3 file with tRedshiftBulkExec.
You can write this file with tRedshiftOutputBulk (I think you can also do it manually).
Good luck
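A minimal sketch of the "manual" route, assuming the CSV is already in S3: build a Redshift COPY statement whose explicit column list comes from the CSV header itself, so the load does not depend on a fixed Talend schema. The table name, bucket path, and IAM role below are placeholders, and the statement would still need to be executed over a JDBC/ODBC connection.

```python
import csv
import io

def build_copy_statement(table, csv_header_line, s3_path, iam_role):
    """Build a Redshift COPY with an explicit column list taken
    from the CSV header line, instead of a fixed schema definition."""
    columns = next(csv.reader(io.StringIO(csv_header_line)))
    col_list = ", ".join(columns)
    return (
        f"COPY {table} ({col_list})\n"
        f"FROM '{s3_path}'\n"
        f"IAM_ROLE '{iam_role}'\n"
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

sql = build_copy_statement(
    "ft_clicks",                                      # table from the error message
    "click_id,user_id,clicked_at",                    # hypothetical header
    "s3://my-bucket/clicks.csv",                      # placeholder S3 path
    "arn:aws:iam::123456789012:role/redshift-copy",   # placeholder IAM role
)
print(sql)
```

IGNOREHEADER 1 makes COPY skip the header row that the column list was read from.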
wangbinlxx
Creator
Author

Hi Francois,

I tried tRedshiftBulkExec, with the same result as tRedshiftOutputBulk: it gives the same error, "This component doesn't support Dynamic Type".

 

Do you have an example of how to "manually do it"?

 

Thanks,

 

 

fdenis
Master

Sometimes, to use a bulk load, you have to add the field definitions into the bulk file or into another file.
I have not done an S3 bulk load myself, but all bulk loads have common requirements.
You have to find the S3 file definition, or create a file with the table definition and retrieve the file to analyse.
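One way to "create a file with the table definition", as a rough sketch: derive a CREATE TABLE statement from the CSV header, typing every column as VARCHAR for simplicity (the file, table name, and column names below are hypothetical; real types would need refining by inspecting the data).

```python
import csv
import tempfile

def ddl_from_csv_header(path, table):
    """Derive a simple CREATE TABLE statement from a CSV header.
    Every column is typed VARCHAR(256) as a lowest common denominator."""
    with open(path, newline="") as f:
        header = next(csv.reader(f))
    cols = ",\n  ".join(f"{c} VARCHAR(256)" for c in header)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

# demo with a throwaway CSV file
with tempfile.NamedTemporaryFile(
    "w", suffix=".csv", delete=False, newline=""
) as f:
    f.write("click_id,user_id,clicked_at\n1,42,2019-01-01\n")
    demo_path = f.name

ddl = ddl_from_csv_header(demo_path, "ft_clicks")
print(ddl)
```

The generated DDL (or a columns file built the same way) can then drive the COPY, so the target table always matches the incoming file.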