Qlik Community

QlikView Scripting

Discussion Board for collaboration on QlikView Scripting.


Not applicable

Data Load for 10 million records

Hello Everyone,

I am using QV 11 to load ten million rows of data from a tab-delimited text file whose data is in Unicode format.

The data load gets stuck after importing 1 million records and doesn't progress any further; only the elapsed time keeps increasing.

Any idea why that would happen? Is there a specific setting I may have missed that is needed to complete the data load?

I waited a couple of hours for the load to complete, but it never finishes. The earliest help/response would be greatly appreciated,

because the entire project now depends on this data load. Thanks!

Tarun

Tags (1)
1 Solution

Accepted Solutions
MVP & Luminary

Re: Data Load for 10 million records

Hi Tarun,

Maybe this is caused by a structural problem in the source data or by a special character. You could try to load all records as fixed-length lines (the whole row into one field) and do a deeper analysis on the row where it gets stuck.

LOAD @1:n as Line
FROM source.tsv
(fix, codepage is utf8);

- Ralf
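
Building on Ralf's suggestion, here is a sketch of how the fixed-format load could be extended to pinpoint the problem row. The table names, the file name, and the 4000-character length threshold are assumptions for illustration, not from the thread:

```
// Load every raw line together with its record number,
// so the row where the load stalls can be identified.
RawLines:
LOAD
    RecNo()    as RowNo,
    @1:n       as Line,
    Len(@1:n)  as LineLength   // unusually long or short lines often flag broken rows
FROM source.tsv
(fix, codepage is utf8);

// Keep only suspicious rows, e.g. lines far longer than expected.
Suspects:
NoConcatenate
LOAD RowNo, Line
RESIDENT RawLines
WHERE LineLength > 4000;   // threshold is an assumption; adjust to your data

DROP TABLE RawLines;
```

After the reload, the Suspects table can be inspected in a table box to see which rows contain stray delimiters or unexpected characters.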

8 Replies
vivek_niti
Contributor

Re: Data Load for 10 million records

Can you share a sample of the file?

eddysanchez
Contributor

Re: Data Load for 10 million records

Is the data in this file incremental? If so, you can use a buffered incremental load.

Ten million records is not very much; you may need to process it on a machine with more memory, or partition the data before loading.

Not applicable

Re: Data Load for 10 million records

Hi Vivek,

It's transaction-level data with more than 50 columns. This Unicode extract comes from another tool, and we intend to load it completely in one go to create a QVD file.

Is there a way we can load the data incrementally and keep saving the QVD at the same time?

Sorry about not being able to share the data, as there are some confidentiality constraints by the client.

Thanks in advance for your help

Not applicable

Re: Data Load for 10 million records

Hi Eddy,

I am new to QlikView. Can you please advise how I can use a buffered incremental load?

We are using an 8 x 32 dedicated server to load the data.

We actually intend to store the loaded data in a QVD and then use the QVD for creating the dashboard.

Thanks for your help!

Tarun

eddysanchez
Contributor

Re: Data Load for 10 million records

The way to optimize your QlikView project is to use at least two QVWs:

First, an Extractor QVW used only to store each table in QVDs (for example, from your txt file, only the columns and rows that you will use). Buffer incremental can only be used if your txt file never updates or deletes its lines and only appends new lines at the end.

You use it just by adding "Buffer (Incremental)" before the LOAD statement.

Then you consume these QVDs.
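
A minimal sketch of the workflow Eddy describes, assuming the source is an append-only tab-delimited text file; the table name, file names, and format options are placeholders, not from the thread:

```
// Extractor QVW: Buffer (Incremental) reloads only the lines appended
// to the text file since the last run, caching the rest in a QVD buffer.
Transactions:
Buffer (Incremental)
LOAD *
FROM source.txt
(txt, utf8, embedded labels, delimiter is '\t');

// Persist the full table as a QVD for the dashboard QVW to consume.
STORE Transactions INTO transactions.qvd (qvd);
```

The dashboard QVW then loads from transactions.qvd instead of the raw text file, which is much faster and keeps the heavy extraction isolated in its own reload.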


suniljain
Honored Contributor

Re: Data Load for 10 million records

Please check in the log file where exactly it gets stuck. The log file helps you find the exact reason behind the hang.

Not applicable

Re: Data Load for 10 million records

Thanks so much, Ralf.

It actually was an issue with character-set quoting.

@Sunil - the log file helped identify the problem, and Ralf's solution worked. Thank you both. Cheers!