Hi All,
I'm trying to create a job that will ingest tables from Hive into Greenplum. I have thousands of tables, so creating a job per table would be a tedious process.
One possible way to do this is by using the Dynamic type in the schema, but apparently the Hive and Greenplum components don't support it.
I'm currently using a CSV file to test this, since tFileInputDelimited supports the Dynamic type.
Any idea on other ways to achieve this?
All suggestions will be highly appreciated.
Thank you.
Did you ever find a fix for this issue? I'm facing a similar problem, haven't gotten any response, and couldn't find anything about it on Google.
Hi, no, I haven't found a fix for this yet. I also can't find anything related to it on Google.
If you happen to have the target table in the right format (I suggest writing a script that generates it for you), then using the Greenplum GPLoad component you should be able to insert data into Greenplum quickly and efficiently.
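To illustrate the "script that generates the target table" idea, here is a minimal sketch that turns Hive column metadata (e.g. parsed from `DESCRIBE <table>` output) into Greenplum `CREATE TABLE` DDL. The type mapping and the input format are assumptions for illustration, not part of Talend or Greenplum; adjust both for your environment.

```python
# Sketch: generate Greenplum CREATE TABLE DDL from Hive column metadata.
# HIVE_TO_GP is an assumed (partial) type mapping -- extend it as needed.
HIVE_TO_GP = {
    "string": "text",
    "int": "integer",
    "bigint": "bigint",
    "double": "double precision",
    "float": "real",
    "boolean": "boolean",
    "timestamp": "timestamp",
    "date": "date",
}

def gp_create_table(table, columns):
    """Build a CREATE TABLE statement for Greenplum.

    columns: list of (name, hive_type) tuples, e.g. parsed beforehand
    from 'DESCRIBE <table>' output in Hive.
    """
    cols = ",\n  ".join(
        f"{name} {HIVE_TO_GP.get(hive_type.lower(), 'text')}"  # default to text
        for name, hive_type in columns
    )
    return f"CREATE TABLE {table} (\n  {cols}\n);"

# Example with made-up table metadata:
ddl = gp_create_table(
    "sales", [("id", "bigint"), ("amount", "double"), ("region", "string")]
)
print(ddl)
```

Looping this over the full table list from the Hive metastore would give you one DDL script for all thousands of tables, instead of hand-building a schema per table.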
Doesn't the GPLoader component require a schema in order to insert into the target table? If so, that means I'd have to populate it per table.
Is it possible to use the GPLoad component without setting a schema?
If you're not trying to use the create-table functionality, then the schema is optional. (There's a checkbox to force the Talend schema, which is useful if you want to load fewer columns, or a different column order, than you have in the target.)
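As a sketch of the schema-optional approach, the gpload utility underneath is driven by a YAML control file, and leaving out an explicit column list lets it map the source data to the target table's own columns. The snippet below generates such a control file per table; the host, port, delimiter, and file paths are placeholders, not values from this thread, so treat it as a template rather than a working config.

```python
# Sketch: emit a minimal gpload control file with no explicit COLUMNS
# section, so the source data is matched to the target table's columns.
# Connection details and paths below are assumed placeholders.
GPLOAD_TEMPLATE = """\
VERSION: 1.0.0.1
DATABASE: {database}
USER: {user}
HOST: gpmaster.example.com
PORT: 5432
GPLOAD:
  INPUT:
    - SOURCE:
        FILE:
          - {source_file}
    - FORMAT: text
    - DELIMITER: '|'
  OUTPUT:
    - TABLE: {table}
    - MODE: insert
"""

def make_control_file(database, user, table, source_file):
    """Fill the template for one table; call once per table to load."""
    return GPLOAD_TEMPLATE.format(
        database=database, user=user, table=table, source_file=source_file
    )

ctrl = make_control_file("analytics", "gpadmin", "public.sales", "/data/sales.txt")
print(ctrl)
```

Generating one control file per table this way pairs naturally with the DDL-generation script idea: both can be driven from the same Hive table list.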
Will try this. Thank you for the information.