AndrewS
Contributor II

Am I losing precision on large decimal values on import? Decimal(38,0)

I'm attempting to recreate legacy financial reports in Qlik Sense prior to retiring a system. The old system uses Oracle, and primary and foreign keys are stored as random, non-sequential NUMBER(19,0) values. Our ETL process converts these to decimal(38,0), I assume to avoid overflow issues. I can confirm precision is maintained through our ETL process, with all digits visible on query. However, when loading the data into Qlik Sense I see this:

[Screenshot AndrewS_0-1662563670651.png: loaded ID values showing only about 14 digits of precision]

With only 14 digits of precision showing, I'm having trouble confirming that my tables are joining correctly. Early in the process I ran into an issue where Excel was trimming the last 4-5 digits off of every ID and creating false record matches. Is there a way to change the precision of the data shown, or documentation someone can point me to that confirms precision is maintained?
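For context on why this happens: Qlik Sense (like Excel) stores numeric values as IEEE 754 64-bit doubles, which carry only about 15-17 significant decimal digits, so two distinct 19-digit IDs can collapse to the same stored value. A small Python sketch of the effect (the ID values are made-up examples):

```python
# Two distinct 19-digit IDs (example values), differing only in the last digit
a = 1234567890123456789
b = 1234567890123456788

# As 64-bit doubles they round to the same number: a false record match
print(float(a) == float(b))   # True

# As text, the same IDs remain distinct
print(str(a) == str(b))       # False
```

This is the same rounding that produced the false matches in Excel; any tool that treats these IDs as floating-point numbers will behave the same way.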

AndrewS
Contributor II
Author

Oh... no....

Any chance I can split the PK fields in the load script with LEFT() and RIGHT() and continue to join tables that way?
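Unfortunately, splitting in the load script can't recover the digits: by the time Left() and Right() run, the value has already been read into a 64-bit double, so they split the rounded number rather than the original. The same effect, sketched in Python (the 19-digit ID is an assumed example value):

```python
key = 1234567890123456789        # original 19-digit ID (example value)

# Simulate the ID passing through a 64-bit double on load
rounded = int(float(key))        # rounding has already happened here

left_part = str(rounded)[:10]    # analogous to Left(IDFIELD, 10)
right_part = str(rounded)[10:]   # analogous to Right(IDFIELD, Len(IDFIELD)-10)

print(right_part)                # tail comes from the rounded double...
print(str(key)[10:])             # ...and no longer matches the true digits
```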

AndrewS
Contributor II
Author

For anyone who stumbles upon this in the future:

Right(IDFIELD, Len(IDFIELD)-10) in the load script didn't work: everything to the right of the 15th digit was a zero, because the precision was already lost when the number was read in. That said, Qlik did pick up that each table had a LEFTID and RIGHTID field, created a synthetic table, and used those fields as primary keys.

I've gone back to my ETL software and am casting each of the numeric IDs as text (varchar(21)).
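Text keys sidestep the problem entirely, because string comparison is exact and never rounds. A toy Python illustration of a lookup on text IDs (the field values and names are made up):

```python
# IDs stored as text: comparisons are exact, so joins stay correct
accounts = {
    "1234567890123456789": "Account A",  # example 19-digit IDs that would
    "1234567890123456788": "Account B",  # collide if stored as doubles
}

fact_id = "1234567890123456788"
print(accounts[fact_id])   # Account B — matched exactly, no collision
```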

The bad news: there are... over 300 of them, so it's going to take a while.

The good news: a quick data load on each app, and Qlik seemed to pick up the changes without breaking anything.