Hi, I'm working to merge one dashboard onto another, but I don't want it to interact with any of the fields on the other dashboard yet - just trying to give it a sheet in that document so people have fewer documents to navigate.
The dashboard I'm moving reads a few hundred sheets of raw data off a network drive. These sheets have changed shape from time to time, so it loads in batches like
load * from archive1;
concatenate load * from archive2;
and so on
For the merge, I did it like this:
Qualify *;
WAB:
load * from archive1;
WAB:
concatenate load * from archive2;
Unqualify *;
It seems to be beating the tar out of our server. Is the Qualify command really this demanding, or am I using it incorrectly?
Hoping I'm just using it wrong this time. If it really is that demanding, I'm going to have to go field by field, aliasing each field as WAB_field, down hundreds of lines of script...
PS> I know best practice is to pull from a database or QVDs; it's just that not all of this raw data gets taken into the database. And on the other hand, not all of it has been vetted, so I can't directly map it to the clean database version of it yet. I will definitely be setting up a QVD to warehouse historic records nobody will be touching again, but the most recent 12 months of files will likely remain unwarehoused to be safe on that front. Just troubleshooting this qualify behavior here.
Yes, it's a lot easier to do such qualifying (maybe a RENAME statement could be an alternative) as the last step.
- Marcus
And the fields are actually all listed out in the script - the * in the load examples above is just shorthand for the post.
Does it burn the server without the Qualify statement as well?
If yes, try to reduce the number of fields you're pulling out.
I believe if you do it like this, you will get a massively wide table: even fields with the same name in both tables will end up in different columns, because of the different table prefixes.
Have you tried with just a couple of tables (2) and checked whether everything looks like you expect?
Qualifying itself isn't very demanding - it just adds the table name to the field names. But as swuehl pointed out, you have probably created an awkward datamodel with some kind of synthetic keys and/or circular references and/or calculations over loosened tables, which could be very expensive, or something similar.
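For illustration, with invented table and field names:

QUALIFY *;    // from here on, loaded fields get their table name as a prefix
Customers:
LOAD CustomerID, Name FROM Customers.qvd (qvd);
// the fields arrive as Customers.CustomerID and Customers.Name
UNQUALIFY *;  // switch the prefixing off again for later loads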
I suggest you rethink your approach: combining two or several different datamodels into a single application is in general possible, but it could have some serious disadvantages, both in the effort to develop and maintain them and in the usability, since there could be unwanted/unexpected dependencies between these datamodels.
I did this in my earlier days with QV and can say it confused the users a lot that they had to select/deselect this and that when switching between these datamodels, and it takes real effort to label and document it all, using various visibility and/or calculation conditions and select/deselect actions, just to avoid or minimize potential misuse.
Maybe today, with the use of alternate states, it could be easier, but I wouldn't recommend it, since QlikView should be used to simplify things and not the opposite.
Nevertheless, if you want to try it, I would use two different generator apps to create the separate datamodels, store them in QVDs, and then build the application from those.
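A minimal sketch of one such generator (paths, sheet name and QVD name are just placeholders):

WAB_Model:
LOAD * FROM archive1\*.xls (biff, embedded labels, table is [Sheet1$]);
STORE WAB_Model INTO WAB_Model.qvd (qvd);
DROP TABLE WAB_Model;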
- Marcus
Thanks for the responses.
1> I'm just trying to use qualify on the one data load from the one QlikView document I'm moving onto the main QlikView document.
2> I was trying to keep the same table name on all of the worksheet loads, but I think it was adding -1, -2, -3 ... -25 and the like on its own as they were loaded. So I think it did the thing one of you described, where the table gets awkwardly huge even when field names are the same, because the qualify prefixes keep counting up. The QVW ran fine without burning the server before qualify was used, so it was something to do with the qualify.
3 Next approach> I will try letting it build the table from the concatenated worksheets first, then store that into a QVD, then qualify-load just the QVD - maybe one distinct QVD will work better than hundreds of little Excel worksheets. (And at worst, I'll just have the one patch of script to write 'as WAB_<fieldname>' on.) The field names are otherwise identical, down to the caps. As an example, the most recent batch has 'DateofBirth as DateOfBirth' written in to keep a field-name change in sync with the existing version of that field name. The concats were only building out truly unique field names until qualify started putting the numbered table name ahead of them...
4> I could try alternate states instead of qualify, but this data is a 'Before' type view of what the data looked like when it was sent but before it was processed into the database, and the main dashboard's data model is an 'After' type view. The structure is otherwise the same, so a massive synthetic key would result unless I did a concatenate and made a qualifier-type field for people to use as a toggle (sketched below, after this list). That might be more awkward for the end user than me just putting it onto a separate tab with prequalified fields.
5> End users have been in the habit of consulting two different QlikView documents for these views, so they understand the two different meanings of the data. (They were previously searching through hundreds of files on a network drive to see what was queued for processing before it was processed, which was worse.)
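(The toggle from 4> would be something like this, with a made-up flag field and QVD names:)

Combined:
LOAD *, 'Before' as RecordStage FROM Before.qvd (qvd);
Concatenate (Combined)
LOAD *, 'After' as RecordStage FROM After.qvd (qvd);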
Thanks again for the thoughtful responses, I will check back with helpful/correct after I confirm a final solution. Just working to make it two tabs of the same QlikView document for now.
I think you have further problems with the datamodel. In 2> you write that you get tables like TABLENAME-1, TABLENAME-2 and so on - this looks like you load data with a file wildcard such as Load * From path\file*.ext, but if the files don't have the exact same data structure they won't be concatenated automatically; QV creates a new table for each different data structure.
You could solve this by loading the files per filelist with a forced concatenate: Re: Loading file content into QV from a list of files. You could also look here: Get started with developing qlik datamodels - it covers many challenges in building a datamodel.
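The forced concatenate over a filelist could look roughly like this (file mask, table name and format spec are just placeholders):

For Each vFile in FileList('path\file*.xls')
    If IsNull(NoOfRows('Data')) Then   // first file: the table does not exist yet
        Data:
        LOAD * FROM [$(vFile)] (biff, embedded labels, table is [Sheet1$]);
    Else
        Concatenate (Data)             // force the append even if the columns differ
        LOAD * FROM [$(vFile)] (biff, embedded labels, table is [Sheet1$]);
    End If
Next vFile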
- Marcus
The files are being loaded from 7 or 8 different archive folders. You are correct that I am using the wildcard on the file names. Fields for new data points have indeed been added over time - this is what distinguishes one archive folder from the next - so I do use the concatenate command rather than expect passive concatenation by QlikView. (There is exactly one data structure, but many files, in any given archive folder.) The loads are like this (except I have the fields written out in reality):
TableName:
Load Name, DateOfBirth from archive1\*.xls;
Concatenate Load Name, DateofBirth as DateOfBirth from archive2\*.xls;
Concatenate Load Name, DateofBirth as DateOfBirth, Source from archive3\*.xls;
Expected final table is like:
TableName:
Name
DateOfBirth
Source
I think qualify is doing something to ram the table name onto the field names as they are being loaded, which actually makes sense. Without qualify, if I run script like the above and then look at the table viewer after the script runs, I see the one table called TableName with the expected fields.
So what I'm going to try next is to let the above happen without the qualify command. Then either store it into a QVD (for the heck of keeping a master archive-type file somewhere) or do a resident load from TableName. Then use the qualify/unqualify around that QVD or resident load, which should appear as just one table at that point. Or, worst case, manually update each field on that QVD/resident load with an 'as table_fieldname'. This should be manageable at the point where it is one piece of load script with 50 fields instead of 8 with 50 fields.
Ultimately, when I copy/paste this piece of script from the one QVW onto the main QVW and reload, I want the table viewer to show this table just hanging out by itself, without any lines to the other tables on the main QVW. It really is intended to be only an aggregation of files on the network drive that people check when a> troubleshooting unexpected database data or b> they just want to know if a file has been received but not yet processed into the database.
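Roughly what I have in mind (the QVD name is made up):

// 1) build the combined table as above, without qualify
// 2) park it in a qvd and drop it from the model
STORE TableName INTO WAB_Archive.qvd (qvd);
DROP TABLE TableName;
// 3) reload the single qvd inside qualify/unqualify
QUALIFY *;
WAB:
LOAD * FROM WAB_Archive.qvd (qvd);
UNQUALIFY *;
// fields should come back as WAB.Name, WAB.DateOfBirth, WAB.Source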
Gave you the correct answer for confirming that qualifying one table at the end is less insane than qualifying all the preliminary tables, for talking it out, and for suggesting the RENAME command. I'm going to look at that RENAME command because I like using commands that are more explicit and less passive, as a kind of best practice. It makes things more solid and also gives QlikView less to think about.
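If I've got the syntax right, the RENAME version would go at the very end of the load script, something like this (field list abbreviated):

RENAME FIELD Name TO WAB_Name;
RENAME FIELD DateOfBirth TO WAB_DateOfBirth;
RENAME FIELD Source TO WAB_Source;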