Fields are random. I have to add the above logic to multiple reports because we save a data dump for users, but the data is huge, so they cannot read more than 10 lakh (1,000,000) records; that is the main concern. I tried RowNo(), autogenerate, and a FOR loop, but I'm confused about how to divide the data into multiple CSVs when the data is different in each report. For example:
The first CSV should contain records 1 to 8 lakh (800,000), the next CSV should start at 800001 and run to 1600000, and so on.
If I had to solve this, I would first look for the lowest level of granularity in the data.
For example, a department, a product group, or something similar that reduces the number of records per file.
Once I identify that field, I would create one CSV for each of its values and share those with the users.
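To illustrate that approach outside of Qlik, here is a minimal Python sketch of the same idea: group the records by one field and write one CSV per value. The field name "dept" and the file naming pattern are hypothetical, not from the original post.

```python
import csv
from collections import defaultdict

def split_by_field(rows, field, header):
    """Group `rows` (dicts) by `field` and write one CSV per group value.
    Returns the list of file names written."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[field]].append(row)
    written = []
    for value, group in groups.items():
        # Hypothetical naming scheme: one file per distinct field value
        name = f"report_{value}.csv"
        with open(name, "w", newline="") as f:
            w = csv.DictWriter(f, fieldnames=header)
            w.writeheader()
            w.writerows(group)
        written.append(name)
    return written
```

This only helps if some field actually partitions the data into reader-sized pieces; otherwise a fixed row-count split (as in the answer below) is needed.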
Hi, I think this will help you out; see the sample code below.
t:
LOAD RowNo() as row, *
FROM [source.xlsx] (ooxml, embedded labels, table is Sheet1);
LET count = NoOfRows('t');
LET file = 0;
DO WHILE count > 0
  slice: NoConcatenate LOAD * RESIDENT t
  WHERE row > $(file)*250 AND row <= ($(file)+1)*250;
  STORE slice INTO [f$(file).csv] (txt);  // (txt) writes a delimited/CSV file
  DROP TABLE slice;
  LET count = $(count) - 250;
  LET file = $(file) + 1;
LOOP
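For readers who want to check the chunking logic outside of Qlik, here is an equivalent sketch in Python: consume the rows in fixed-size chunks and write each chunk to its own numbered CSV. The chunk size of 250 matches the sample script above; the "f" file prefix is just the same placeholder name.

```python
import csv
from itertools import islice

def store_in_chunks(rows, header, chunk_size=250, prefix="f"):
    """Write `rows` into prefix0.csv, prefix1.csv, ... with at most
    chunk_size data rows per file. Returns the file names written."""
    it = iter(rows)
    file_no = 0
    names = []
    while True:
        chunk = list(islice(it, chunk_size))  # next batch of rows
        if not chunk:
            break
        name = f"{prefix}{file_no}.csv"
        with open(name, "w", newline="") as fh:
            w = csv.writer(fh)
            w.writerow(header)
            w.writerows(chunk)
        names.append(name)
        file_no += 1
    return names
```

For the original question, raise chunk_size to 800000 to get files of 1 to 8 lakh, 800001 to 1600000, and so on.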