I have a SQL query that returns roughly 300 million rows and 12 columns. It generates a .txt file that is 200 GB. Instead of loading that file, I could create a stored procedure and call it from Qlik.
1) Can Qlik handle that much data?
2) I already separated the DateTime into Date and Time and took off the seconds so there are fewer unique values for Qlik to store. Any other advice?
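For reference, a minimal sketch of the DateTime split described above in Qlik load script (the field and table names are illustrative, not from the original post):

```
// Split a timestamp into a day part and a time-of-day part at minute
// precision, so each field has far fewer distinct values to store.
Facts:
LOAD
    Date(Floor(EventDateTime))                  AS EventDate,  // whole days only
    Time(Frac(Floor(EventDateTime, 1/1440)))    AS EventTime,  // floored to the minute
    OtherField
RESIDENT RawData;
```

Floor(ts, 1/1440) rounds the timestamp down to the nearest minute, and Frac() keeps only the time-of-day fraction, so EventTime has at most 1,440 distinct values.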
A text file isn't indicative of what you'll get in Qlik, since Qlik stores data in an entirely different way. Start by loading the data into a QVF and seeing how large it is. Qlik doesn't need to handle your text file, just its own internal storage, and if you have enough memory it should be able to handle the data.
Generally speaking, some tricks that reduce file size include:
- Reducing distinct values (which you seem to have already started on)
- Rounding or flooring numbers where unnecessary precision exists (and explicitly formatting the result to the desired format)
- Using AutoNumber or otherwise hashing out any key fields
- Dropping any unused fields
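The tricks above can be sketched in a single load statement. This is only an illustration under assumed field and file names, not your actual schema:

```
// Illustrative load applying the size-reduction tricks listed above.
Orders:
LOAD
    AutoNumber(CustomerKey)     AS %CustomerKey,  // replace a wide key with a compact integer
    Round(Amount, 0.01)         AS Amount,        // cap precision at two decimals
    Num(Round(Amount, 0.01), '#,##0.00') AS AmountFormatted,  // explicit display format
    Date(Floor(OrderDateTime))  AS OrderDate      // date only, no time component
    // Unused columns are simply not listed here, so they are never loaded.
FROM [lib://Data/orders.qvd] (qvd);
```

Fields that slip through anyway can be removed afterwards with DROP FIELD.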
Qlik Sense Document Analyzer (which is deprecated and has a replacement, but the original link is https://qlikviewcookbook.com/tools/#squelch-taas-accordion-shortcode-content-4 ) can help you find where you might be able to trim down your file.