Schema discrepancy between data task and data warehouse view
Hi,
I am confused as to why this is occurring. There is a discrepancy between the datatypes of some columns in the views created in the data warehouse and the datatypes defined for those columns in the data task. The discrepancy persists even after recreating the tables; I even tried deleting the tables in the target and then recreating them.
For example, I have this table called dim_invoice. In the QTC data task, the Nexus column's datatype is INT8.
However, if I go to the view in the data warehouse, the datatype of this column is nvarchar.
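In case it helps, this is a minimal sketch of the kind of check I ran to confirm the discrepancy, using an in-memory SQLite database as a stand-in for the warehouse (everything other than the `dim_invoice` table and `Nexus` column names is illustrative, not the actual product API):

```python
import sqlite3

# Stand-in for the target data warehouse.
conn = sqlite3.connect(":memory:")

# The view/table as it actually exists in the target: Nexus is nvarchar.
conn.execute("CREATE TABLE dim_invoice (Nexus nvarchar(255))")

# The schema as defined on the data task in the console.
expected = {"Nexus": "INT8"}

# Pull the declared column types from the target.
actual = {
    row[1]: row[2]  # row = (cid, name, type, notnull, dflt_value, pk)
    for row in conn.execute("PRAGMA table_info(dim_invoice)")
}

# Report columns whose declared type differs from the task definition.
mismatches = {
    col: (want, actual.get(col, "<missing>"))
    for col, want in expected.items()
    if actual.get(col, "").upper() != want.upper()
}
print(mismatches)  # {'Nexus': ('INT8', 'nvarchar(255)')}
```

Against the real warehouse I ran the equivalent query over its system catalog, with the same result: the declared type does not match what the data task shows.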
I should point out that the actual data in this column is indeed a string type. Example below
However, the behaviour I would have expected is that the QTC data task creates the schema in the target data warehouse with the datatypes defined in the console, so that the target and the data task reflect exactly the same schema, regardless of whether the real data matches those types.
Furthermore, in my specific scenario, I would then expect the data load to fail with a datatype-mismatch error, prompting me to investigate and fix it. The main problem is that, no matter what I do, I can't define the schema manually from the console; it appears to change by itself without notifying me.
This issue is inconvenient because it leads to errors later in the pipeline, since the datatypes of some columns are not accurately represented. How do I ensure that the datatypes defined in the console are always the ones created in the data warehouse? Any assistance would be greatly appreciated. Thank you 🙂