Hi Experts,
From my QVW I save data for the users daily, in CSV format. But the data grows day by day and the file size grows with it. My CSV file now contains 34,00,000 (3.4 million) records, but only about 10,00,000 (1 million) records per file can be read, so everything above 10 Lakh records is lost. I want to split the data into multiple files when saving. How can I split the data into multiple files?
For Eg.
Payout.csv (3,400,000 records)
OutPut Files:
Payout1.csv
Payout2.csv
Payout3.csv
Payout4.csv
Regards,
Rupali Ethape.
Hi Rupali,
Can you tell us which fields you have in your file?
Regards,
Kaushik Solanki
Hi, I think this will help you out. Please find the sample code below.
Table_1:
LOAD RowNo() as row,
     Regno
FROM [test.xlsx]
(ooxml, embedded labels, table is Sheet1);

LET file  = 0;
LET count = NoOfRows('Table_1');

DO WHILE count > 0
    // Take the next slice of 250 rows (row is 1-based, so use > and <=)
    Table_2:
    FIRST 250
    LOAD Regno
    RESIDENT Table_1
    WHERE row > $(file) * 250 AND row <= ($(file) + 1) * 250;

    // (txt) writes a comma-separated text file: f0.csv, f1.csv, ...
    STORE Table_2 INTO [f$(file).csv] (txt);
    DROP TABLE Table_2;

    LET count = $(count) - 250;
    LET file  = $(file) + 1;
LOOP
Hi,
The fields are random, and I would have to add the above logic to multiple reports, because we save a dump for the users but the data is huge, so they cannot read beyond 10 Lakh (1 million) records; that is the main concern. I tried with RowNo(), AUTOGENERATE and a FOR loop, but I'm confused about how to divide the data into multiple CSVs when the data is different in each report. For example:
The first CSV should contain records 1 to 8 Lakh (800,000), the next CSV should start at 800,001 and run to 1,600,000, and so on.
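(Editor's note, not part of the original thread: the fixed-size split described above, records 1 to 800,000 in the first file, 800,001 to 1,600,000 in the next, and so on, can be sketched in load script roughly as follows. It assumes the dump is already loaded as a table named Payout with "RowNo() as row" added at load time; the table and file names are only placeholders.)

```qlikview
// Sketch: split a loaded table into 800,000-row CSV files.
// Assumes Payout was loaded with "RowNo() as row" as one of its fields.
LET vChunk = 800000;
LET vRows  = NoOfRows('Payout');
LET vFile  = 0;

DO WHILE vFile * vChunk < vRows
    LET vOut = $(vFile) + 1;

    // NOCONCATENATE stops the slice auto-concatenating back onto Payout
    Chunk:
    NOCONCATENATE LOAD *
    RESIDENT Payout
    WHERE row >  $(vFile) * $(vChunk)
      AND row <= ($(vFile) + 1) * $(vChunk);

    // Writes Payout1.csv, Payout2.csv, ...; (txt) produces a comma-separated file
    STORE Chunk INTO [Payout$(vOut).csv] (txt);
    DROP TABLE Chunk;

    LET vFile = $(vFile) + 1;
LOOP
```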
Hi,
If I had to solve this, I would first look for the lowest level of granularity in the data.
For example, a department, a product group, or something similar that reduces the number of records per file.
Once I identify that field, I would create one CSV for each of its values and share those with the users.
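(Editor's note, not part of the original thread: as a rough illustration of this per-value approach, using placeholder names since the actual fields vary per report, the loop could look like this. Department and Payout are assumed names, not from the original data.)

```qlikview
// Sketch: write one CSV per distinct value of a grouping field.
// "Department" and "Payout" are placeholder names for this thread.
FOR i = 1 TO FieldValueCount('Department')
    LET vVal = FieldValue('Department', $(i));

    Slice:
    NOCONCATENATE LOAD *
    RESIDENT Payout
    WHERE Department = '$(vVal)';

    STORE Slice INTO [Payout_$(vVal).csv] (txt);
    DROP TABLE Slice;
NEXT
```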
Regards,
Kaushik Solanki