Qlik Community

QlikView App Development

Discussion Board for collaboration related to QlikView App Development.

Contributor II

Converting Large QVD files (more than 1 million rows) to CSV format

Dear All, I am looking for a script to help me convert a large QVD (10 million rows) into CSV format. As you know, CSV has a limitation of 1 million rows max. I am not an expert in Qlik, so any help will be highly appreciated.

1 Solution

Accepted Solutions

Hi, if you want to split the QVD, it will be easier with an incremental number field to use in the load. You can create one using:

 

QvdWithRowNumber:
LOAD *, RowNo() as RowNumber FROM [qvdPathAndName](qvd);
STORE QvdWithRowNumber INTO [qvdPathAndName](qvd);

 

Having that field, you can use a script like:

 

// Variables for qvd read config
SET vQvdReadPath = .\; // Path to the qvd to read
SET vQvdReadName = QvdName; // Name of the qvd to read
SET vKeyField = RowNumber; // Name of the field holding the row number

// Variables for csv write config
SET vQvdWritePath = .\; // Path to write the csv files to
SET vQvdWriteName = Split;  // Base name of the csv files to write
SET vNumberRowsPerFile = 1000000; // Number of rows per split file

// Variables used during processing
SET vQvdRead = $(vQvdReadPath)$(vQvdReadName).qvd;
LET vNumRows = QvdNoOfRecords('$(vQvdRead)');
LET vNumberRowsReaded = 0;
LET vNumFilesSplitted = 1;

// Loop to load and store the files
DO
  // Generate the keys of the next chunk (Where Exists on them keeps the load optimized)
  KeysToLoad:
  LOAD RecNo()+$(vNumberRowsReaded) as $(vKeyField) AutoGenerate $(vNumberRowsPerFile);
	
  // Load only the rows whose keys were just generated
  SplittedQvd:
  LOAD * FROM [$(vQvdRead)](qvd) Where Exists($(vKeyField));
	
  DROP Table KeysToLoad;
  STORE SplittedQvd into [$(vQvdWritePath)$(vQvdWriteName)_$(vNumFilesSplitted).csv](txt);
  DROP Table SplittedQvd;
	
  // Update the process variables
  LET vNumberRowsReaded = $(vNumberRowsReaded) + $(vNumberRowsPerFile);
  LET vNumFilesSplitted = $(vNumFilesSplitted) + 1;

LOOP WHILE $(vNumberRowsReaded)<$(vNumRows)
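As an optional sanity check, this is a sketch (assuming the variables from the script above are still set, and that STORE wrote the default comma-delimited files with header row) that loads each generated CSV back and traces its row count:

// Verification sketch: count the rows in each generated csv file.
// Assumes the variables from the splitting script above are still set.
FOR vFile = 1 TO $(vNumFilesSplitted) - 1
  CheckTable:
  LOAD * FROM [$(vQvdWritePath)$(vQvdWriteName)_$(vFile).csv]
  (txt, utf8, embedded labels, delimiter is ',');
  LET vRowsInFile = NoOfRows('CheckTable');
  TRACE $(vQvdWriteName)_$(vFile).csv contains $(vRowsInFile) rows;
  DROP Table CheckTable;
NEXT vFile

The counts of all files except the last one should equal vNumberRowsPerFile, and the total should match vNumRows.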

 

 


6 Replies

Hi, as far as I know CSV has no row limit. The limit appears when you try to open CSV files in Excel, which does have one. But if you want the CSV to be processed by another system (like another QVW document), you can use a single CSV for all rows.

Contributor II

You are right, I do want to open the exported data in Excel. Can I do that by exporting multiple files?

Creator II

Hello,

QVD files don't have any limitation in size.

Just store the QVD in CSV format:

STORE tablename INTO path\tablename.csv (txt);

Excel may not open this CSV completely, but when you load it into an application all the rows will be visible.
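One detail worth adding: as far as I know, STORE writes qvd format when no format specifier is given, regardless of the file extension, so the (txt) specifier is needed to get a real CSV. A minimal sketch (the table name MyTable and the path D:\Data are hypothetical, for illustration only):

// Load a qvd and write it out as a single csv file
MyTable:
LOAD * FROM [D:\Data\MyTable.qvd] (qvd);

// (txt) makes STORE write a delimited text file instead of a qvd
STORE MyTable INTO [D:\Data\MyTable.csv] (txt);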

Thanks


MVP & Luminary

What is the aim of opening a CSV with 1 M records in Excel? Even without further calculations, just sorting/filtering the data will behave quite slowly. Therefore it may be more suitable to keep the data just as CSV, and if there is a need to check it as raw data you could open it with an editor like Notepad++ and search for certain values and/or copy & paste the data from there into Excel (whereby this task is always better done in Qlik with the QVD data).

- Marcus

Contributor II

Many thanks for the solution, much appreciated 😀