Hi,
I am developing an ETL job in which a variety of different JSON schemas are loaded into an RDBMS. Each key in the JSON maps one-to-one to a column in a table. Every schema shares about five keys with all the others; the rest differ. Right now I manually create a metadata JSON file for each schema and manually map it to the corresponding table. However, in a few months the number of different JSON schemas we receive will grow dramatically, so manually creating a metadata file and a mapping for each schema becomes infeasible.
Is there a way to build a job that generically takes any JSON schema I throw at it and automatically maps each value to a table column whose name equals the JSON key, perhaps using tSetDynamicSchema, or tWriteDynamicFields/tExtractDynamicFields?
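To illustrate what I'm after, here is a rough sketch in plain Java of the generic mapping I have in mind: parse a record, treat its keys as column names, and assemble the INSERT at runtime. This uses Jackson, and the target_table key plus the naive string quoting are just my assumptions for illustration; real code should bind values via a PreparedStatement.

import java.util.ArrayList;
import java.util.List;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class GenericJsonMapper {
    // Build an INSERT whose column list mirrors the JSON keys.
    // Assumes a flat JSON object whose keys are valid column names.
    public static String buildInsert(String table, JsonNode record) {
        List<String> cols = new ArrayList<String>();
        List<String> vals = new ArrayList<String>();
        record.fields().forEachRemaining(e -> {
            cols.add(e.getKey());
            // Naive quoting for the sketch only; use parameter binding in real code.
            vals.add("'" + e.getValue().asText().replace("'", "''") + "'");
        });
        return "INSERT INTO " + table + " (" + String.join(", ", cols)
             + ") VALUES (" + String.join(", ", vals) + ")";
    }

    public static void main(String[] args) throws Exception {
        // "target_table" is my hypothetical routing key; see the edit below.
        JsonNode rec = new ObjectMapper()
            .readTree("{\"target_table\":\"orders\",\"id\":1,\"amount\":9.5}");
        System.out.println(buildInsert(rec.get("target_table").asText(), rec));
    }
}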
Thanks,
Nilan
edit: Just to add, each different type of schema goes into a different table. The JSON data itself determines the target table: currently I store the relevant key/value from the JSON in a global variable and then use that variable as the table name in the tMysqlInput component.
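For context, the global-variable trick is just this (the component placement and the "targetTable" key name are mine, for illustration). In a tJavaRow after parsing the JSON:

// Stash the routing value so later components can read it.
globalMap.put("targetTable", input_row.target_table);

Then in the DB component's Table field I use ((String)globalMap.get("targetTable")). That part works fine; it's the per-schema metadata that doesn't scale.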
edit2: I have also seen this page:
https://help.talend.com/pages/viewpage.action?pageId=5671283, which shows how to deal with changing data structures. However, the dynamic data type isn't supported by tFileInputJSON; is there perhaps a workaround for this?
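One workaround I'm considering, since tFileInputJSON won't emit the dynamic type: skip it entirely, read the whole file with tFileInputRaw in "Read the file as a string" mode, and do both the parsing and the insert inside a tJavaRow against a shared connection. Roughly as follows (the tMysqlConnection_1 name, the target_table routing key, and the use of Jackson are my assumptions; Jackson may need a tLibraryLoad):

// Body of a tJavaRow fed by tFileInputRaw, whose single schema column is "content".
// Assumes a tMysqlConnection component named tMysqlConnection_1 opened earlier
// in the job; Talend registers shared connections in globalMap as "conn_<componentName>".
com.fasterxml.jackson.databind.JsonNode rec =
    new com.fasterxml.jackson.databind.ObjectMapper().readTree(input_row.content);

final java.util.List<String> cols = new java.util.ArrayList<String>();
final java.util.List<String> marks = new java.util.ArrayList<String>();
rec.fields().forEachRemaining(e -> { cols.add(e.getKey()); marks.add("?"); });

String table = rec.get("target_table").asText(); // my assumed routing key
String sql = "INSERT INTO " + table + " (" + String.join(", ", cols)
           + ") VALUES (" + String.join(", ", marks) + ")";

java.sql.Connection conn =
    (java.sql.Connection) globalMap.get("conn_tMysqlConnection_1");
try (java.sql.PreparedStatement ps = conn.prepareStatement(sql)) {
    int i = 1;
    for (String c : cols) ps.setString(i++, rec.get(c).asText());
    ps.executeUpdate();
}

This sidesteps the dynamic schema entirely, at the cost of losing Talend's type handling (everything is bound as a string here). Does anyone know whether the dynamic type route is cleaner?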