rupaliqlik (Creator)

Split a big CSV file into multiple CSV files

Hi Experts,

From my QVW I save data for the users daily (in CSV format). But the data grows day by day, and the file size grows with it. My CSV file now contains 34,00,000 (34 lakh) records, while the CSV can only hold 10,00,000 (10 lakh) records, so everything above 10 lakh records could not be saved. I want to split the data across multiple files when saving. How can I split the data into multiple files?

For Eg.

Payout.csv (record count = 34,00,000)

Output files:

Payout1.csv

Payout2.csv

Payout3.csv

Payout4.csv


Regards,

Rupali Ethape.


Tagged: stalwar1, jagan, kaushik.solanki, antoniotiman


4 Replies
kaushiknsolanki (Partner Ambassador/MVP)

Hi Rupali,

Can you tell us which fields you have in your file?

Regards,

Kaushik Solanki

Please remember to hit the 'Like' button and for helpful answers and resolutions, click on the 'Accept As Solution' button. Cheers!
Anonymous
Not applicable

Hi, I think this will help you out; see the sample code below.

Table_1:
LOAD
    RowNo() as row,    // running row number used to slice the data
    Regno
FROM [test.xlsx]
(ooxml, embedded labels, table is Sheet1);

LET file = 0;
LET count = NoOfRows('Table_1');

DO WHILE count > 0

    table_2:
    FIRST (250)
    LOAD
        Regno
    RESIDENT Table_1
    // <= on the upper bound, otherwise the 250th row of each chunk is skipped
    WHERE row > $(file) * 250 and row <= ($(file) + 1) * 250;

    STORE table_2 INTO [f$(file).txt] (txt);   // (txt) writes a delimited text file
    DROP TABLE table_2;

    LET count = $(count) - 250;
    LET file = $(file) + 1;

LOOP;

DROP TABLE Table_1;

rupaliqlik (Creator, Author)

Hi,

The fields are random. I would have to add the above logic to multiple reports, because we save a data dump for users, but the data is huge and they cannot read more than 10 lakh records; that is the main concern. I tried with RowNo(), Autogenerate and a FOR loop, but I'm confused about how it will divide the data into multiple CSVs when the data is different in each report. For example:

The first CSV would contain records 1 to 8,00,000, the next CSV would contain 8,00,001 to 16,00,000, and so on.
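A sketch of that chunking idea in Qlik load script, field-agnostic so it can be reused across reports (LOAD * avoids naming any columns). Assumptions: the source is Payout.csv in the working directory, the format spec matches the real file, and the variable names and 8-lakh chunk size are illustrative.

// Sketch only: split Payout.csv into chunks of 8,00,000 rows each.
SET vChunkSize = 800000;

Source:
LOAD
    *,
    RecNo() as row    // running record number used for slicing
FROM [Payout.csv]
(txt, utf8, embedded labels, delimiter is ',');

LET vTotal = NoOfRows('Source');
LET vFile  = 1;

DO WHILE (vFile - 1) * vChunkSize < vTotal

    Chunk:
    NoConcatenate    // required: Chunk has the same fields as Source
    LOAD *
    RESIDENT Source
    WHERE row > ($(vFile) - 1) * $(vChunkSize)
      and row <= $(vFile) * $(vChunkSize);

    STORE Chunk INTO [Payout$(vFile).csv] (txt);
    DROP TABLE Chunk;

    LET vFile = $(vFile) + 1;

LOOP;

DROP TABLE Source;

With 34,00,000 input records and a chunk size of 8,00,000, this would produce Payout1.csv through Payout5.csv (the last one partially filled). NoConcatenate matters here: because Chunk has exactly the same fields as Source, Qlik would otherwise silently concatenate it back into Source.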

kaushiknsolanki (Partner Ambassador/MVP)

Hi,

If I had to solve this, I would first look for the lowest level of granularity in the data.

For example, a department, a product group, or something similar that reduces the number of records per file.

Once I identify that field, I will create one CSV for each of its values and share those with the users.
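The per-field approach above can be sketched like this in Qlik load script, assuming a hypothetical grouping field called Department (substitute the real field name); FieldValueList iterates over its distinct values:

// Sketch only: one CSV per distinct value of a grouping field.
Data:
LOAD * FROM [Payout.csv]
(txt, utf8, embedded labels, delimiter is ',');

FOR Each vDept in FieldValueList('Department')

    Slice:
    NoConcatenate    // Slice has the same fields as Data
    LOAD *
    RESIDENT Data
    WHERE Department = '$(vDept)';

    STORE Slice INTO [Payout_$(vDept).csv] (txt);
    DROP TABLE Slice;

NEXT vDept

DROP TABLE Data;

Note that this only helps if the field's values contain no characters that are illegal in file names, and if each value's slice stays under the 10 lakh limit.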

Regards,

Kaushik Solanki
