
Send to Excel: 65,000+ rows result in corrupt data

Hi

I have run into a problem. When a user exports a table with more than 65,000 rows to Excel, the data is opened as a CSV file. The data in the first column of the sheet (column A) then gets corrupted: at roughly every 1,000th row (sometimes every 2,500th or 3,333rd row), a blank/null character is inserted in the cell before the cell text.

The easiest way to find this is to add a row-number column next to the data and another column with the Excel formula =LEFT(A1;1), then use a filter to show the nulls/blanks. You can then see that the error occurs in roughly every 1,000th or 2,500th row. (A sketch of the same check outside Excel follows below.)
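For anyone who wants to reproduce the check without opening Excel, here is a minimal Python sketch that scans the exported CSV and lists the rows whose first column starts with a blank or non-printing character. The file name "export.csv", the semicolon delimiter and the latin-1 encoding are assumptions, not necessarily what the export actually produces; adjust them to match your file.

```python
# Minimal sketch: flag rows in the exported CSV whose first cell starts
# with a blank or non-printing character (assumed file name/delimiter/encoding).
SUSPECT = {"\x00", "\ufeff", "\u00a0", " ", "\t"}

with open("export.csv", encoding="latin-1") as f:
    for rownr, line in enumerate(f, start=1):
        first_cell = line.split(";", 1)[0]
        if first_cell and first_cell[0] in SUSPECT:
            # repr() makes the invisible leading character visible
            print(f"row {rownr}: first cell starts with {first_cell[0]!r}")
```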

Of course I can increase RowLimitForCsvInsteadOfXls so that more rows are allowed before CSV kicks in, but that isn't really a solution to the problem.

Can anyone else recreate the problem, and does anyone know a solution or why this occurs?

In the pictures below, column F is =LEFT(A2;1) and column E is the row number (1, 2, 3, ...).

In the first picture no filter is applied; in the second picture a filter is applied on column F to show the nulls.

[Attachment: 2.png]

[Attachment: 1.png]

0 Replies