I am stuck. (Attached is this scenario in an app, with an attempt to start fixing it with a key table.) I have the table structure below and was using a KEY table to join all of these tables, which works fine with only 1 year of data, but the key table has grown too big, as has the database, when loading multiple years of data. Just 2 years created a 5.8GB QV file. I would very much appreciate any insight anyone has on a solution.
Here is the structure. In the example, the index with the same name as the table is the primary key (SQL) for that table; the rest are foreign keys.
Table1
    Table1Index
    Table6Index
    Table7Index
Table2
    Table2Index
    Table1Index
    Table5Index
Table3
    Table3Index
    Table6Index
    Table7Index
Table4
    Table4Index
    Table3Index
    Table5Index
Table5
    Table5Index
    Table6Index
Table6
    Table6Index
    Table7Index
Table7
    Table7Index
Tables 1-5 are very large data tables (20+ million rows each); Tables 6 and 7 are fact tables. I tried combining (joining) some of the data tables, but that forces workarounds for SUM in expressions, which slows performance down too much.
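For context, the usual link-table pattern for a structure like this collects the distinct key combinations from each table into a single link table, then drops the shared key fields from the source tables so only the link table associates them. A minimal sketch for Table1 against the fact-table keys, using the field names above (the QVD source and the %Key67 name are illustrative assumptions, not from the post):

```
// Illustrative link-table pattern for Table1 (not the poster's actual script).
Table1:
LOAD Table1Index,
     Table6Index,
     Table7Index,
     // One composite key replaces the two shared fields below.
     Table6Index & '|' & Table7Index AS %Key67
FROM Table1.qvd (qvd);

// Collect the distinct key combinations once, into a single link table.
LinkTable:
LOAD DISTINCT
     %Key67,
     Table6Index,
     Table7Index
RESIDENT Table1;

// Drop the shared fields so Table1 associates only through the link table.
DROP FIELDS Table6Index, Table7Index FROM Table1;
```

The same pattern repeats (with CONCATENATE loads into LinkTable) for Tables 2-5; with 20M+ rows per table, the number of distinct key combinations is what determines whether the link table stays manageable.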
Thanks very much.
Hi.
Do you have synthetic keys?
If I use a link table, there are no synthetic keys. There are 2 similar tables that either need to be linked separately to the link table or need to be joined together.
Option 1: link them to a link/key table and keep the tables separate. This results in the link/key table becoming too big, and the application never completes loading. I think this is because it is trying to join the key tables together.
Option 2: join the 2 tables together. This results in duplicate data, which requires Aggr or Distinct in expressions to work around the duplicates. Since the tables are so large (20M+ rows each), performance becomes too slow.
All of these links between tables are required.
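One thing worth trying before abandoning Option 1: QlikView's AutoNumber() maps each distinct composite text key to a small sequential integer, which often shrinks a key table considerably because long concatenated strings are replaced by integers in the symbol tables. A hedged sketch (the QVD source and the %LinkKey name are assumptions, not from the post):

```
// Assumption: the link table is built from concatenated index fields.
// AutoNumber(expr, 'Link') returns the same integer for the same expr value,
// so all tables sharing the 'Link' namespace get matching integer keys.
Table2:
LOAD *,
     AutoNumber(Table1Index & '|' & Table5Index, 'Link') AS %LinkKey
FROM Table2.qvd (qvd);
```

Note that AutoNumber values are only stable within a single reload, so every table that shares %LinkKey must be autonumbered in the same script run, using the same namespace.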
"...and the application never completes loading"
It seems you're trying your solutions with the whole data set.
Why don't you set up some kind of test environment with a smaller amount of data and load your tables properly so that there are no synthetic keys?
I am. It will load with a smaller set of data. Loading 1 or 2 years works as well, but the app becomes too big and performance is terrible. It will not complete loading with 3 or more years.
That's strange. If you replace the synthetic keys with your links, the file should be smaller and the app should be faster.