Hello to all!
I'm trying to load a .txt file into Qlik Sense and I'm having a problem capturing all 32K rows in that file. I'm using a very simple load script on a single file, and the end result in Qlik only shows me around 19K rows. I downloaded the original file from the AWS S3 bucket and compared it to the CSV export of the table Qlik created from the same file. I noticed that after a certain number of lines, the load script seems to ignore the end of a line and merge lines together. I'm adding the load script here as an example.
LOAD
[@1] as description,
[@2] as root,
[@3] as main_day,
[@4] as location,
[@5] as main_year,
left(FileName(), len(FileName())-4) as dataset
FROM [lib://xxx_S3/output/]
(txt, codepage is 28591, no labels, delimiter is ';', msq);
I just don't know if it's a problem with the txt file format or a problem with the loading script.
I will appreciate any thoughts on this.
Thank you in advance!
If possible, please share a sample file so we can understand what's going on.
Hello Digvijay!
I've created a data sample that shows the issue, and I've tried replying to your message several times over the past week. Unfortunately, I kept getting an error message from the community site; it looks like the inbox was full.
I didn't know we have the option to upload from here until now.
Sorry for my delay in sharing the sample data in QVD format with you.
Thank you in advance!
Hi,
Did you try the CSV format? I see a similar error mentioned in the post below; see if it helps. It appears debugging would require running the load script, and your connection won't work for me.
https://community.qlik.com/t5/QlikView-App-Dev/Load-from-txt-not-loading-all-records/td-p/822289
Thanks,
Hi!
Ok, I'll try the CSV format instead.
Thanks!
Your issue might be caused by the msq setting and/or any EOF characters - therefore you may try it with the following instead:
(txt, codepage is 28591, no labels, delimiter is ';', no eof)
- Marcus
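To check whether stray quote or EOF characters are really what is tripping up the load, you could scan the raw file before reloading it in Qlik. The sketch below is a minimal Python diagnostic, not part of anyone's actual workflow here; `output.txt` is a placeholder for the file downloaded from the S3 bucket. An odd number of double quotes would confuse msq-style quoting, a Ctrl-Z byte can act as an end-of-file marker, and bare carriage returns can merge or split lines.

```python
def scan_file(path: str) -> dict:
    """Count lines and characters that can derail a delimited-text load.

    Reads the file as raw bytes so no decoding step hides problem characters.
    """
    with open(path, "rb") as f:
        data = f.read()
    return {
        # Total lines, counting a trailing partial line without a newline
        "lines": data.count(b"\n") + (1 if data and not data.endswith(b"\n") else 0),
        # An odd count means at least one unbalanced quote (breaks msq quoting)
        "double_quotes": data.count(b'"'),
        # Ctrl-Z (0x1A) is historically treated as end-of-file in text mode
        "eof_chars": data.count(b"\x1a"),
        # Carriage returns not followed by \n, i.e. stray CRs inside the data
        "bare_cr": data.count(b"\r") - data.count(b"\r\n"),
    }

if __name__ == "__main__":
    # Placeholder path - point this at the file pulled from the S3 bucket
    print(scan_file("output.txt"))
```

If `lines` already matches the expected 32K, the file itself is intact and the format spec (msq / EOF handling) is the likely culprit; a non-zero `eof_chars` or an odd `double_quotes` count points directly at the characters Marcus mentions.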
Hello Marcus!
I'll update the format spec as you suggested.
Thank you very much for your catch!