AndrewS
I'm attempting to recreate legacy financial reports in Qlik Sense before retiring a system. The old system uses Oracle, and primary and foreign keys are stored as random, non-sequential NUMBER(19,0) values. Our ETL process converts these to DECIMAL(38,0), I assume to avoid overflow issues. I can confirm precision is maintained through the ETL process, with all digits visible on query. However, when loading the data into Qlik Sense I see this:

[screenshot: each ID displayed with only 14 digits of precision]

With only 14 digits of precision showing, I'm having trouble confirming that my tables are joining correctly. Earlier in the process I ran into an issue where Excel was trimming the last 4-5 digits off of every ID and creating false record matches. Is there a way to change the precision of the data shown, or can someone point me to documentation confirming that precision is maintained?
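A minimal sketch of one way to pin down where the digits disappear (all table and field names below are placeholders, not the actual schema): load the same key twice, once as a number and once cast to text in the source query, then compare the two copies in the load script.

    IdCheck:
    LOAD
        ID_NUM,
        ID_TXT,
        // If the numeric copy no longer matches its text twin, the
        // digits were lost on load into Qlik, not in the ETL layer.
        If(Text(ID_NUM) <> ID_TXT, 1, 0) as PrecisionLost;
    SQL SELECT ID AS ID_NUM,
               CAST(ID AS VARCHAR(21)) AS ID_TXT
          FROM SOURCE_TABLE;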
 
					
				
		
MarcoWedel

[reply content not preserved]
MarcoWedel

[reply content not preserved]
AndrewS
		Oh... no....
Any chance I can split the PK fields in the load script with LEFT() and RIGHT() and continue to join tables that way?
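The idea being roughly the following (table and field names are placeholders):

    Orders:
    LOAD
        // Split the oversized key into two halves so joins compare two
        // short values instead of one 19-20 digit number.
        // (As the follow-up below shows, this doesn't work here: Qlik
        // stores numbers as 64-bit floating point, so digits past ~15
        // are already gone before Left()/Right() ever run.)
        Left(IDFIELD, 10)                   as LEFTID,
        Right(IDFIELD, Len(IDFIELD) - 10)   as RIGHTID,
        OTHER_FIELDS;
    SQL SELECT IDFIELD, OTHER_FIELDS FROM SOURCE_TABLE;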
AndrewS
For anyone who stumbles upon this in the future:
Right(IDFIELD, Len(IDFIELD)-10) in the load script didn't work: everything right of the 15th digit was a zero. That said, Qlik did pick up that each table had a LEFTID and RIGHTID field, created a synthetic table, and used those fields as primary keys.
I've gone back to my ETL software and am casting each of the numeric IDs as text (varchar(21)).
The bad news: there are... over 300 of them, so it's going to take a while.
The good news: a quick data load on each app showed Qlik picking up the changes without breaking anything.
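For reference, the equivalent cast written directly in a load script would look something like this (names are placeholders; the actual change here was made upstream in the ETL tool):

    Orders:
    LOAD
        ORDER_ID;   // arrives as text, so all 19 digits survive and
                    // joins compare full strings, not rounded doubles
    SQL SELECT CAST(ORDER_ID AS VARCHAR(21)) AS ORDER_ID
          FROM SOURCE_TABLE;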
