I am trying to load a text file with 16M rows into QlikView, but about 9 rows are getting excluded. When I load the same text file into SQL Server and connect it to QV, all the records show up correctly.
Since this approach was working, I now want to store the data into a QVD file, so my load script looks like this:
// SQL connection string here
Table1:
SQL SELECT * FROM "Database".dbo."Table1";
STORE Table1 INTO $(vQVDLocation)Table1.qvd (qvd);
But this fails with the following error at about 2M rows:
Next, I tried storing to a .csv file instead, and that failed after about 5M rows.
I couldn't find anything helpful online, so any suggestion is appreciated.
Have you checked that the machine doesn't run out of RAM or disk space?
Have you tried creating a new document that only loads the data from the SQL database and stores it into a qvd?
Have you tried debugging your script with a limited load, say around 10k rows? If that works without errors, it would suggest the machine ran out of RAM or disk space, as gwassenaar mentioned.
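A limited load can be sketched like this (the connection details are placeholders; the `First` prefix caps the number of rows fetched by the following SELECT):

```
// Hypothetical connection -- replace with your own connection string
ODBC CONNECT TO MyDSN;

Table1:
First 10000
SQL SELECT * FROM "Database".dbo."Table1";

// Store the sample to a separate QVD so it doesn't overwrite the real one
STORE Table1 INTO $(vQVDLocation)Table1_sample.qvd (qvd);
```

If this small store succeeds, the failure at 2M rows is most likely a resource limit rather than a script error.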
I would suggest loading your table in chunks, storing each chunk in a smaller QVD, and then reading them back with an optimized load (for example, in a loop), letting QlikView auto-concatenate the results in your transform QVW.
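One way to sketch that chunked approach (assuming a numeric key column, here called `RowID`, which you would replace with a real column in your table):

```
// Extract QVW: pull the table in 2M-row chunks and store each as its own QVD
FOR i = 0 TO 7
    LET vFrom = $(i) * 2000000;
    LET vTo   = $(vFrom) + 1999999;

    Chunk:
    SQL SELECT * FROM "Database".dbo."Table1"
    WHERE RowID BETWEEN $(vFrom) AND $(vTo);

    STORE Chunk INTO $(vQVDLocation)Table1_$(i).qvd (qvd);
    DROP TABLE Chunk;
NEXT

// Transform QVW: optimized QVD loads; identical field lists auto-concatenate
FOR i = 0 TO 7
    Table1:
    LOAD * FROM $(vQVDLocation)Table1_$(i).qvd (qvd);
NEXT
```

Because each `LOAD *` from a QVD with no WHERE clause or transformations is an optimized load, reassembling the chunks should be fast even at 16M rows.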
Are there special characters in your Table1?
I was able to store the QVD when I used a brand-new QVW file instead of an existing shell. I even loaded the QVD, and the dashboard refreshed perfectly.
Thank you for all the responses. This is an awesome community!