Qlik Sense failing to load large unsigned integers
I'm trying to load data from a MySQL database, which has a "bigint unsigned" field containing values such as 180622210702018657 - that's 18 digits.
When loaded into Qlik Sense, the values look like 1.8062221070201e+17. What's worse, these values are unique keys, and Qlik is often merging several records together - presumably because it has failed to load the integers exactly.
Unfortunately there is no way to shorten these values; they hold meaning for our clients.
Is there any way Qlik can handle unsigned ints of this size? Is there something I can do in the load script to tell it what to expect from the incoming data? I've tried loading the field through num() to preformat it as an 18-digit number, but that has no effect. Even if it worked, it would be a poor solution: it would create a string representation of the number, which is wasteful memory-wise and really shouldn't be necessary.
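For what it's worth, the symptoms match plain IEEE-754 double rounding: 2^53 (9007199254740992, about 16 digits) is the largest range in which a 64-bit double can hold every integer exactly, and an 18-digit value is well past that. A quick Python check (nothing Qlik-specific, just illustrating the arithmetic on the example value above) shows the same truncation:

```python
# 18-digit key from the database; a 64-bit double can only represent
# integers exactly up to 2**53 (= 9007199254740992, ~16 digits)
key = 180622210702018657

assert key > 2**53                  # past the exact-integer range of a double
print(float(key))                   # trailing digits are lost
print(int(float(key)) == key)       # False: the value doesn't survive a round-trip
```

If Qlik stores the field as a double internally, this would explain both the e+17 display and the lost precision, regardless of any formatting applied afterwards.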
Re: Qlik Sense failing to load large unsigned integers
As a hypothetical exercise, I found the lowest value in the field and created a new field (also a bigint) with the lowest value subtracted. It still resulted in quite large numbers - 15 digits instead of 18 as shown below, with the original number on the left and the new one on the right...
Loaded into Qlik, the new numbers look like this:-
And not only that, but it's still cutting off some of the digits, and as a result some of the values are merging with others. The database holds 105,100,859 unique values for this field. Here's what Qlik's telling me from a Count() of the field versus a Count(distinct) ...
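That Count() versus Count(distinct) gap is exactly what rounding to a double would produce: once keys are wider than the 53-bit mantissa, runs of nearby integers collapse onto the same stored value. A small Python illustration (the key values here are made up, just matching the 18-digit magnitude; this assumes, not confirms, that Qlik is storing the field as a double):

```python
# 64 consecutive 18-digit keys, all distinct as integers
keys = [180622210702018600 + i for i in range(64)]
assert len(set(keys)) == 64

# after a round-trip through a 64-bit double, many of them land on the
# same value - i.e. distinct keys appear "merged"
collapsed = {float(k) for k in keys}
print(len(set(keys)), "integer keys ->", len(collapsed), "distinct doubles")
```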
I'm very surprised that a product touted for its ability to work with large datasets would have such a fundamental flaw, which leads me to think there must be some way to get it to load this data correctly! If not ... Qlik's developers have some work to do.