Hello all,
So I'm trying to read from a database. My laptop has 8 GB of RAM, and at around 400 million records I get the message "Object ran out of memory" and the load fails.
The thing is, when I try the load on a system with 64 GB of RAM, it also crashes at almost a billion rows.
Is that too little memory for this much data?
Depends on how much data each row has and the cardinality of the values. 500 million rows consisting of two numeric fields that both only contain integer values between 0 and 7 won't take up much memory. Records with two hundred columns with millions of distinct values in each column will take up lots and lots of memory.
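As a rough sketch of why cardinality matters: as I understand Qlik's storage model, each field is kept as a symbol table of its distinct values plus a bit-stuffed pointer per row. A field with only 8 distinct values needs about 3 bits per row, so 500 million rows cost roughly 500,000,000 × 3 bits ≈ 190 MB for that field, plus a tiny symbol table. A field with millions of distinct values needs much wider row pointers (20+ bits) and a large symbol table on top, which is where the memory goes.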
Okay, let me see if I can describe the data well. The table originally has around 30 fields, but I am only loading two of them: a foreign key to another table and a field called amount. The foreign key can appear several times in the table; it is an integer of 2-12 digits, and the amount field is also an integer of 9-12 digits. What do you think @Gysbert_Wassenaar ?
I think you should load 1 million rows and store that into a QVD. Then use QViewer (https://www.easyqlik.com/) to look at the table metadata for more information about each field, like the number of unique values and the storage required. Then extrapolate that to 500 million rows.
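A minimal load-script sketch of that sampling step (the database connection is assumed to exist already, and SourceTable, FKField, and Amount are placeholder names for your actual table and fields):

```
// Sample only the first 1 million rows from the source table
// (SourceTable, FKField and Amount are placeholder names)
Sample:
First 1000000
LOAD FKField,
     Amount;
SQL SELECT FKField, Amount
FROM SourceTable;

// Write the sample to disk so QViewer can inspect its metadata
STORE Sample INTO Sample.qvd (qvd);
```

QViewer's metadata view then shows the number of unique values and the storage required per field, which you can scale up to the full row count.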
Hi Ioannagr,
This is the sizing suggestion I gave to one of our company's clients. To be on the safe side, we proposed a somewhat higher range than the estimate. Hope this might be helpful.
Thanks
Prasanthan
Hello @Gysbert_Wassenaar, thank you. Is it free?
@prasanthan_ravindran Thank you, definitely useful! Will forward this one for sure!
There is a free version.