MadiF
Contributor III

Table Data Disappears Upon Forced Concatenation, but Reappears Without the Concatenate Statement?

I'm currently attempting to concatenate several Excel files' worth of data into a single table, and so far I've achieved this by concatenating explicitly. Now that I'm trying to add a fourth table (which loads fine when loaded by itself), the data simply disappears once I add the Concatenate statement. There's no error; the data is just gone instead of being concatenated. I'm not sure whether I'm somehow running into a memory issue or a table-length limitation, if such a thing exists. The script is long, but here is a summary version for reference:

 

[Table 1]:
Load * From ...;
...
//(Table 2)
Concatenate ([Table 1])
Load * From ...;
...
//(Table 3)
Concatenate ([Table 1])
Load * From ...;
...
//(Table 4)
Concatenate ([Table 1])
Load * Resident ...;
...

So, when that Concatenate ([Table 1]) line above Table 4 isn't there, Table 4 loads perfectly fine on its own. I need it combined with the overall table, though. As soon as I add that line, I get no errors, but the data from that table is nowhere to be found in the overall table. I'm not sure whether the fact that it's a Resident load also plays a part. Please help.

 

1 Solution

Accepted Solutions
ArnadoSandoval
Specialist II

Hi @MadiF 

As a debugging suggestion, would it be possible for you to create a separate load script for the offending table, standing on its own, and store its data in a QVD? It is odd that the whole table disappeared, but once we find an explanation, the reason will seem obvious!

If I were you, I would split this load script into four, i.e. four different applications, each just generating its corresponding QVD.
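For example, a minimal sketch of what the standalone QVD-generator script and the matching Concatenate could look like (the library paths, file name and table label below are only placeholders, not taken from your actual script):

// QVD-generator app: load only the offending Table 4 source and store it
OctNovDecBO:
Load *
From [lib://Data/Table4Source.xlsx] (ooxml, embedded labels);   // placeholder source

Store OctNovDecBO Into [lib://Data/OctNovDecBO.qvd] (qvd);
Drop Table OctNovDecBO;

// Main app: concatenate the prepared QVD onto the combined table
Concatenate ([Table 1])
Load *
From [lib://Data/OctNovDecBO.qvd] (qvd);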

Another option is to share the load script with us: just copy it into a text file and attach it to your next reply.

hth

Arnaldo Sandoval
A journey of a thousand miles begins with a single step.


5 Replies
ArnadoSandoval
Specialist II

Hi @MadiF 

I am sure you have already looked for your missing data without success. Would you try tagging the records of each table with a field identifying their source? In the example below I introduced the field 'Source'; once your load is complete, you can add a selector on this 'Source' field (if you already have a field named 'Source', try a different name).

[Table 1]:
Load *, 't1' As Source From ...;
...
//(Table 2)
Concatenate ([Table 1])
Load *, 't2' As Source From ...;
...
//(Table 3)
Concatenate ([Table 1])
Load *, 't3' As Source From ...;
...
//(Table 4)
Concatenate ([Table 1])
Load *, 't4' As Source Resident ...;

Additional checks/questions:

  • You did not mention any error message, so I suspect you are not getting any errors.
  • What is in the log file for this process, especially when it is about to load Table 4? (See the sketch below for one way to make this explicit in the log.)
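One way to make that visible is to have the script trace the combined table's row count right before and right after the Table 4 concatenation. A rough sketch; the 'Table 1' label comes from the simplified script above, and YourTempTable is only a placeholder for the real resident table name:

Let vRowsBefore = NoOfRows('Table 1');   // combined table before Table 4
Trace Rows before Table 4 concatenation: $(vRowsBefore);

//(Table 4)
Concatenate ([Table 1])
Load *, 't4' As Source Resident YourTempTable;   // placeholder resident table

Let vRowsAfter = NoOfRows('Table 1');
Trace Rows after Table 4 concatenation: $(vRowsAfter);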

Hope this helps,

Arnaldo Sandoval
A journey of a thousand miles begins with a single step.
MadiF
Contributor III
Author

No, there are no errors thrown. The output is quite long (I have a lot of data, and the way I'm bringing it together is a bit more complicated than the simplified version above), but I see nothing unusual in it.

9:27:13 AM   Started loading data
9:27:13 AM   ---
9:27:13 AM   Division_Map << BOData
9:27:13 AM   Lines fetched: 1,560
9:27:13 AM   BO_Prog_Map << JON Summary 10012008-09302020
9:27:13 AM   Lines fetched: 5,963
9:27:13 AM   MonthsBacklog_Temp << FY20 CJI3 AO 9-30-20
9:27:13 AM   Lines fetched: 68,410
9:27:13 AM   MonthsBacklog_Temp-1 << JON Summary 10012008-09302020
9:27:13 AM   Lines fetched: 5,963
9:27:13 AM   MonthsBacklog << MonthsBacklog_Temp            ----------Table 1
9:27:13 AM   Lines fetched: 68,410
9:27:13 AM   MonthsBacklog << BOData
9:27:13 AM   Lines fetched: 70,738
9:27:13 AM   MonthsBacklog << BOData
9:27:13 AM   Lines fetched: 73,159
9:27:13 AM   MonthsBacklog << BOData
9:27:13 AM   Lines fetched: 75,580
9:27:13 AM   MonthsBacklog << BOData
9:27:13 AM   Lines fetched: 77,065
9:27:13 AM   MonthsBacklog << BOData
9:27:13 AM   Lines fetched: 78,589
9:27:13 AM   MonthsBacklog << BOData
9:27:13 AM   Lines fetched: 80,147
9:27:13 AM   MonthsBacklog << BOData
9:27:13 AM   Lines fetched: 81,754
9:27:13 AM   MonthsBacklog << BOData     ---------------All BO Data from Table 2
9:27:13 AM   Lines fetched: 83,403
9:27:13 AM   MonthsBacklog << Financial Workbook Project Tota    ----------------Table 3
9:27:13 AM   Lines fetched: 86,229
9:27:13 AM   OctNovDecBOTemp1 << BOData
9:27:13 AM   Lines fetched: 2,255
9:27:13 AM   OctNovDecBOTemp1 << BOData
9:27:13 AM   Lines fetched: 4,550
9:27:13 AM   OctNovDecBOTemp2 << OctNovDecBOTemp1
9:27:13 AM   Lines fetched: 4,550
9:27:13 AM   OctNovDecBOTemp2 << OctNovDecBOTemp2      ----------Table 4, the one that disappears. I've tried a flag field such as 'from source' and it's nowhere to be found.
9:27:13 AM   Lines fetched: 9,100
9:27:13 AM   MonthsIntoFY << MonthsBacklog
9:27:13 AM   Lines fetched: 1
9:27:13 AM   LinkTable << MonthsBacklog
9:27:13 AM   Lines fetched: 86,229
9:27:13 AM   LinkTable << PFPT
9:27:13 AM   Lines fetched: 13,277
9:27:13 AM   LinkTable << RevisedPlanFunds
9:27:13 AM   Lines fetched: 7,013
9:27:13 AM   LinkTable << ProjDB
9:27:13 AM   Lines fetched: 8,743
9:27:13 AM   LinkTable << Portfolios
9:27:13 AM   Lines fetched: 8,174
9:27:13 AM   Creating search index
9:27:13 AM   Search index creation completed successfully
9:27:13 AM   ---
9:27:13 AM   App saved
9:27:13 AM   ---
9:27:13 AM   Finished successfully
9:27:13 AM   0 forced error(s)
9:27:13 AM   0 synthetic key(s)

 

MadiF
Contributor III
Author

@ArnadoSandoval Loading the table in a separate app, storing it as a QVD, and then concatenating from that QVD worked, thanks! Still no clue why it couldn't just do that in the app itself, but oh well. I'll keep that trick in mind for future efforts.

ArnadoSandoval
Specialist II

Hi @MadiF 

I do not know the architecture of your Qlik environment, but one of my premises is Separation of Concerns, with QVD-creator applications and dashboard applications built from the previously created QVDs.
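As a rough illustration of that split (all table names, paths and file formats here are placeholders): the QVD-creator app ends by storing its table, and the dashboard app only ever reads the prepared QVDs:

// --- QVD-creator app ---
MonthsBacklog:
Load * From [lib://Source/Backlog.xlsx] (ooxml, embedded labels);   // placeholder source
Store MonthsBacklog Into [lib://QVD/MonthsBacklog.qvd] (qvd);
Drop Table MonthsBacklog;

// --- Dashboard app ---
MonthsBacklog:
Load * From [lib://QVD/MonthsBacklog.qvd] (qvd);
Concatenate (MonthsBacklog)
Load * From [lib://QVD/OctNovDecBO.qvd] (qvd);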

hth

Arnaldo Sandoval
A journey of a thousand miles begins with a single step.