Merging Two Documents - ?

Not applicable

Hi,

I am trying to merge two separate models/documents and have had no luck.

In the script there are 16 existing tabs and 7 new ones.

On the front end there are 13 existing sheets (all with their own reports/data) and two new ones.

I am really having trouble keeping the integrity of the new (and old) data. The most common issue is that the "new" reports (on the sheets I'm trying to add to the existing model) show ten times too much data or contain 20-30 duplicate records.

I know to sync up the names in the script, but is there something that I am not doing?

Anything will help, feel free to reply with ideas.

Thanks in advance.

-Chris

6 Replies
gauravkhare
Creator II

Hi Chris,

Maybe you should use the Binary file option. With it you can bring your already existing application into your new one, so you don't need to fetch all the required fields again for the second report, only the fields that are missing from the first one. My explanation may seem a bit complicated, and I'm sorry for that. Kindly make use of the Binary option; it will solve your problem, Chris!
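For what it's worth, a minimal sketch of how the merged script could start - the file name and table below are placeholders, not your actual documents:

// BINARY must be the very first statement in the script.
// It pulls in the entire data model (the tables, not the script) of the other document.
BINARY [ExistingModel.qvw];

// After that, load only the tables and fields that the first document
// does not already have.
NewFacts:
LOAD *
FROM NewData.qvd (qvd);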

Not applicable
Author

Thank you for the quick response, Gaurav!

I will try the Binary file right now and will let you know how it goes!

-Chris

Oleg_Troyansky
Partner Ambassador/MVP

Chris,

BINARY load can only help you simplify your script by reusing the first portion of it "as is" instead of repeating it...

Based on your description of the problem, your real issue is the integrity of the data, and the challenge is merging the two data structures correctly. The duplication of records points at possible "many-to-many" relationships that got created as a result of your "merger"...

It's hard to prescribe in the dark, but generally speaking you need to examine the two data models, the level of detail in each table, and the relationships between the tables, and then carefully connect the new tables to the existing ones with the relevant links, making sure that no duplication is introduced into the mix.
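A quick sanity check on the level of detail of a new table before linking it is to compare its row count with the number of distinct values of the field you plan to link on - if the two differ, that field is not unique there and linking on it will multiply rows. This is only a sketch with made-up table and field names:

// Load the new table first, then compare counts.
NewOrders:
LOAD OrderID,      // candidate link field
     OrderDate,
     Amount
FROM NewOrders.qvd (qvd);

// Note: FieldValueCount() counts distinct values of the field across the whole
// model, so run this check before loading other tables that contain OrderID.
LET vRows = NoOfRows('NewOrders');
LET vKeys = FieldValueCount('OrderID');
TRACE NewOrders has $(vRows) rows and $(vKeys) distinct OrderID values;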

Another approach would be to start from those expressions that show duplicated results. Examine the fields used in those expressions, and the tables that the fields belong to. Those particular tables must be linked incorrectly - some keys might be missing, or some "accidental" links might have to be eliminated.
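For example (again with made-up table and field names), an accidental extra link can be broken by renaming a field, and the intended link can be made explicit with a composite key:

// Orders and Budget both carry a generic "Month" field, which creates an
// unwanted second link (a synthetic key) between the two tables.
Orders:
LOAD OrderID,
     CustomerID,
     Month AS OrderMonth,                        // rename to break the accidental link
     CustomerID & '|' & Month AS %CustMonthKey,  // explicit composite key
     Amount
FROM Orders.qvd (qvd);

Budget:
LOAD CustomerID & '|' & Month AS %CustMonthKey,  // same key on the other side
     BudgetAmount
FROM Budget.qvd (qvd);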

Not applicable
Author

Gaurav,

I never got this working correctly. But what I did was go into the document properties, on the Open tab, select "Initial Data Reduction Based on Section Access" and "Strict Exclusion", leave "Prohibit Binary Load" unchecked, and then reload the model.

I kind of feel like this may not have been what you meant.

Did you mean to go into the script and do a Binary Load of each of the new tabs? Or....? Sorry to keep asking what may be rudimentary questions, but I really want to get this correct.

Let me know please.

Thank you again for your help.

Not applicable
Author

Oleg,

I didn't see your post for some reason until now.

You're absolutely right. The real issue is the integrity of the data. The data duplication is a real problem too.

If the BINARY LOAD doesn't produce the results I'm looking for, I will have to go through each record. That's the only other way I can think of.

Thank you for your input, Oleg.

Not applicable
Author

I did a BINARY LOAD of the model that I'm trying to bring in, and I really didn't notice anything. Is there something I'm forgetting?