The number of records doesn't look unusually high...
With that long a retrieval time, I would guess that either the connection to the SQL server is not that good, or the fields you are filtering on in the WHERE clause are not indexed.
Option 1 is the more technical issue to fix; for option 2, we frequently load all the data and then do the necessary filtering in QlikView once it has been loaded into memory.
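For option 2, a minimal load-script sketch would look something like the following (the table and field names here are made up for illustration). The WHERE is applied in the preceding LOAD, i.e. in QlikView after the rows come back from the database:

MyData:
LOAD *
WHERE F1 = 1 AND F2 = 2;  // filtering happens in QlikView, not in SQL
SQL SELECT * FROM Table1;

Note that this still pulls every record across the connection; it just avoids keeping the unwanted rows in memory.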
Hi, thanks for the reply.
I really couldn't comment on the connection and database speeds; all I know is we're using an Oracle back end. Once I have a solution in place, I'll check with the admin of the test server to see what load the full download puts on it; if it's too much, we won't be given access to the live environment.
Could you possibly spare some time to elaborate a bit on a method for option 1? I can manage option 2 with my current solution, and will look to test it overnight.
You obviously don't want to pull millions of records that you don't need... To avoid dragging all the records into QlikView and then filtering them there, you'll have to do the filtering at the database level, using SQL functionality. Something along the following lines:
SQL SELECT ... FROM Table1
WHERE Key1 IN (SELECT Key1 FROM Table2 WHERE F1 = 1 AND F2 = 2);
Obviously, if your database tables are large, you need to be aware of your database structure: do you have an index on the fields that you are linking on or filtering on, etc.? If you are not the "owner" of your database, I'd suggest that you consult with your DBA.
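To show how that SQL sits inside a QlikView load script, here's a sketch of a complete statement. The connection name, table names, and field names are all hypothetical; you'd substitute your own Oracle connection and schema:

ODBC CONNECT TO MyOracleDSN;  // hypothetical DSN; use your own connection

MyData:
LOAD *;  // optional preceding load for QlikView-side renaming/transforms
SQL SELECT ...
FROM Table1
WHERE Key1 IN (SELECT Key1 FROM Table2 WHERE F1 = 1 AND F2 = 2);

Everything after the SQL keyword is passed straight through to Oracle, so only the filtered rows travel over the connection into QlikView.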