stucas
Contributor

Changing a dynamic schema's column types

I am trying to write a job along these lines:

     tSetGlobalVar
           ||
     tOracleInput ==> tPostgresqlOutput

  • The globalMap holds the table name for the Oracle input
  • The globalMap holds the table name for the Postgres output
  • The Oracle schema is defined as dynamic
  • The Postgres schema has three additional columns, which I populate with values from the globalMap.

The issue is that the inbound tables have Oracle DATE columns, which are mapped to PostgreSQL DATE columns. However, an Oracle DATE carries a time portion and a PostgreSQL DATE does not, so the target column should really be a TIMESTAMP. How can I change the inbound DATE to a TIMESTAMP? I've seen some forum comments about using tJavaFlex or tJavaRow to test the dynamic columns and change their types, but for some reason that seems to interfere with the outbound rows. Most of the Help documentation for doing this only ever uses dates (dates of birth, etc.) with the time portion omitted.
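For what it's worth, the tJavaRow approach I've seen suggested seems to boil down to remapping the per-column DB type held in the dynamic metadata. Below is a self-contained sketch of just that remapping logic. Note the ColumnMeta class is only a stand-in I wrote for this sketch; in a real tJavaRow you would, as far as I can tell, loop over Talend's routines.system.Dynamic column metadata instead, and all names here are placeholders:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the DATE -> TIMESTAMP remapping idea for a tJavaRow.
// ColumnMeta is a hypothetical stand-in for Talend's dynamic column
// metadata, so the loop can be shown outside a running job.
public class RemapSketch {
    static class ColumnMeta {
        String name;
        String dbType;
        ColumnMeta(String name, String dbType) {
            this.name = name;
            this.dbType = dbType;
        }
    }

    // Promote every Oracle DATE column to TIMESTAMP so the time
    // portion survives the trip into PostgreSQL.
    static void promoteDates(List<ColumnMeta> columns) {
        for (ColumnMeta meta : columns) {
            if ("DATE".equalsIgnoreCase(meta.dbType)) {
                meta.dbType = "TIMESTAMP";
            }
        }
    }

    public static void main(String[] args) {
        List<ColumnMeta> cols = new ArrayList<>();
        cols.add(new ColumnMeta("CREATED_AT", "DATE"));     // placeholder column
        cols.add(new ColumnMeta("NAME", "VARCHAR2"));       // left untouched
        promoteDates(cols);
        System.out.println(cols.get(0).dbType + " " + cols.get(1).dbType);
        // prints: TIMESTAMP VARCHAR2
    }
}
```

When I try the equivalent inside the job, though, this is where the interference with the outbound rows appears, so I may be mutating the metadata at the wrong point in the flow.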

 

The context here is that I am writing many routines that follow essentially the same pattern, with only the input and output schemas changing. If I can write and test one generic job, that would save an awful lot of coding.

 

 

0 Replies