
Data Load for 10 million records

Hello Everyone,

I am using QlikView 11 to load ten million rows of data from a tab-delimited text file that contains Unicode data.

The data load gets stuck after importing one million records and doesn't move any further; only the elapsed time keeps increasing.

Any idea why that would happen? Is there a specific setting I may have missed that is needed to complete the data load?

I waited a couple of hours for the load to complete, but it never finishes. The earliest help/response would be greatly appreciated, because the entire project now depends on this data load. Thanks!

Tarun


8 Replies
vivek_niti
Partner - Creator

Can you share a sample of the file?

eddysanchez
Partner - Creator

Is the data in this file incremental? If so, you can use a buffer incremental load.

Ten million records is not very much; you either need to process it on a machine with more memory or partition the data before loading it.

Not applicable
Author

Hi Vivek,

It's transaction-level data with more than 50 columns. This extract (Unicode) comes from another tool, and we intend to load it completely in one go to create a QVD file.

Is there a way we can load data incrementally and at the same time keep saving the QVD?

Sorry about not being able to share the data; there are some confidentiality constraints from the client.

Thanks in advance for your help

Not applicable
Author

Hi Eddy,

I am new to QlikView. Can you please advise how I can use a buffer incremental load?

We are using an 8 x 32 dedicated server to load the data.

We actually intend to store the loaded data in a QVD and then use that QVD for building the dashboard.

Thanks for your help!

Tarun

eddysanchez
Partner - Creator

The way to optimize your QlikView project is to use at least two QVWs:

First, an extractor QVW whose only job is to store each table into QVDs (for example, only the columns and rows of your txt file that you will actually use). Buffer incremental can only be used if your txt file never updates or deletes existing lines and only appends new lines at the end.

You use it by simply adding "Buffer (Incremental)" before the LOAD statement, as in the sketch below.

Then you consume those QVDs in your dashboard QVW.
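A minimal sketch of the two-tier setup Eddy describes, assuming a tab-delimited UTF-8 file with a header row (the file, table, and QVD names are placeholders, not from this thread):

// Extractor.qvw - only appends rows that are new since the last reload
Transactions:
Buffer (Incremental)
LOAD *
FROM source.tsv
(txt, utf8, embedded labels, delimiter is '\t');

// Write the result to a QVD for the dashboard layer
STORE Transactions INTO Transactions.qvd (qvd);

// Dashboard.qvw - consumes the QVD instead of the raw text file
Transactions:
LOAD *
FROM Transactions.qvd (qvd);

Note that Buffer (Incremental) only works if the text file is strictly append-only, as Eddy points out.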

rbecher
MVP

Hi Tarun,

Maybe this is caused by a structural problem in the source data or by a special character. You could try to load all records as fixed-length lines (the whole row into one field) and do a deeper analysis on the row where it gets stuck.

LOAD @1:n as Line
FROM source.tsv
(fix, utf8);

- Ralf

Astrato.io Head of R&D
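To narrow down the exact record, the same fixed-record load can also carry the record number and a per-line field count; a rough sketch building on Ralf's suggestion (the table name and the tab-count check are illustrative, not from the thread):

// Load each raw line with its position and the number of tab-separated fields,
// so the row where the load stalls (or rows with an unexpected field count)
// can be inspected directly.
RawLines:
LOAD
    RecNo() as RowNo,
    @1:n as Line,
    Len(@1:n) as LineLength,
    SubStringCount(@1:n, chr(9)) + 1 as FieldCount
FROM source.tsv
(fix, utf8);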
suniljain
Master

Please check in the log file where exactly it gets stuck. The log file will help you find the exact reason behind the hang.

Not applicable
Author

Thanks so much, Ralf.

It actually was an issue with character-set quoting.

@Sunil - the log file helped identify it, and Ralf's solution worked. Thank you both, cheers!
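The thread doesn't show Tarun's final script, but a character-set/quoting problem in a tab-delimited UTF-8 extract is typically addressed with format-spec options along these lines (file and table names are placeholders):

// Illustrative only - not the actual script from this thread.
// 'utf8' sets the character set; 'no quotes' stops QlikView from treating
// stray quote characters inside fields as field quoting.
Transactions:
LOAD *
FROM source.tsv
(txt, utf8, embedded labels, delimiter is '\t', no quotes);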