Not applicable

Importing multiple tables from a single ODBC source

Hi

I'm using an ODBC connection to our iSeries to grab data.

We have thousands of tables on there, all joined together on Order Number, Customer Code, etc.

I'd like to import, say, 10 of these tables and match them in QlikView. Is that possible?

I couldn't figure it out when I first installed QlikView, so I did a single SQL SELECT. My import script is currently:

SQL SELECT *
FROM RAMBATHS.AULT2F2.ORDLNORD, RAMBATHS.AULT2F2.PARTS, RAMBATHS.AULT2F2.ORDHDORD, RAMBATHS.AULT2F2.OEP20
WHERE cono40=cono55 AND ordn40=ordn55 AND cono55=cono35 AND catn55=pnum35
  AND cono40='01' AND cono40=cono20 AND cusn40=cusn20 AND dseq40=dseq20
  AND Stat55=' ';

That statement grabs 4 of our tables (Order Header, Order Lines, Item Database and Customer Database). It's already rather messy, and adding more files to it is going to get really bad. Is there a way to import the tables into QlikView individually and then match them in QlikView on their key fields instead? I imagine it would be a lot quicker to have QlikView match them rather than the SQL query joining the files.

It takes 18 minutes for the above script to run because we're already talking millions of records, and each of the tables above has 60+ fields in it! I haven't even assigned aliases to fields yet, so I have messily named fields, and I've yet to add our descriptions file, so I have no part names or attribute names either and my report looks messy (an item type means nothing in its code form).

Thanks !

Adam

Screen shot of report attached.

6 Replies
Not applicable
Author

OK, I'm guessing the only way to do this is to export them to CSV files first and then import those? A pity if it can't run over live data.

Not applicable
Author

You say you're running an iSeries (IBM), but which database are you running?

You may want to create a VIEW in the database to simplify your load. That way you combine the tables in the database rather than having QlikView do it.

I'm also fairly new to this, but so far I've found it much simpler to do the ETL in a view first rather than messing about in QlikView.
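
For illustration only, here's a rough sketch of the kind of view I mean, reusing the tables and join conditions from your SELECT above. The view name V_ORDER_DETAIL is made up, I've dropped the RAMBATHS qualifier on the assumption that the view lives in the same library as the files, and if any column names clash between the four files you'd need an explicit column list instead of *:

CREATE VIEW AULT2F2.V_ORDER_DETAIL AS
SELECT *
FROM AULT2F2.ORDLNORD, AULT2F2.PARTS, AULT2F2.ORDHDORD, AULT2F2.OEP20
WHERE cono40=cono55 AND ordn40=ordn55 AND cono55=cono35 AND catn55=pnum35
  AND cono40=cono20 AND cusn40=cusn20 AND dseq40=dseq20;

I've left your company and status filters (cono40='01', Stat55=' ') out of the view so you can still decide per reload which orders to pull. Your QlikView script then shrinks to something like SQL SELECT * FROM AULT2F2.V_ORDER_DETAIL WHERE cono40='01' AND Stat55=' '; and the join logic stays in the database where it's easier to maintain.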

Good luck.

Not applicable
Author

Hello Adam.

First, answering your question: yes, you can import the tables one at a time and define the key fields.

The first thing you need to know is that QlikView automatically associates fields that have the same name, treating them as keys.

Example:

[ORDERS]:
SQL Select ID_ORDER, DE_ORDER from ORDERS; // SQL command

[ITEMS]: // table label
SQL Select ID_ORDER, ID_ITEM, DE_ITEM from ITEMS; // SQL command

In this example, the two tables (ORDERS and ITEMS) will be associated automatically through the field ID_ORDER.

You can rename fields (much as you would in SQL), like this:

[ORDERS]:
Load
  ID_ORDER as ORDER_KEY,
  DE_ORDER as ORDER_DESCRIPTION; // this preceding Load renames the fields returned by the SQL command below
SQL Select ID_ORDER, DE_ORDER from ORDERS;

[ITEMS]:
Load
  ID_ORDER as ORDER_KEY,
  ID_ITEM as ITEM_KEY,
  DE_ITEM as ITEM_DESCRIPTION; // this preceding Load renames the fields returned by the SQL command below
SQL Select ID_ORDER, ID_ITEM, DE_ITEM from ITEMS;

In this case I've given every field an alias, but kept the key name identical in both tables.

You can also filter the data in the second table by looking up values from the first one, like this:

[ORDERS]:
Load
  ID_ORDER as ORDER_KEY,
  DE_ORDER as ORDER_DESCRIPTION
Where Year(DT_ORDER) = 2011; // a filter applied in the first table
SQL Select ID_ORDER, DE_ORDER, DT_ORDER from ORDERS; // DT_ORDER must be selected so the Where clause above can use it

[ITEMS]:
Load
  ID_ORDER as ORDER_KEY,
  ID_ITEM as ITEM_KEY,
  DE_ITEM as ITEM_DESCRIPTION
Where Exists(ORDER_KEY, ID_ORDER); // only items whose ID_ORDER value already exists among the loaded ORDER_KEY values are kept
SQL Select ID_ORDER, ID_ITEM, DE_ITEM from ITEMS;

Tip: you can always take a look at the help file (F1)... it really helps a lot.

Good luck!

Bruno

Not applicable
Author

Thank you very much, Bruno.

Works a charm.

Just one more thing though: I'm looking at getting my import time down.

If I make it grab outstanding orders it takes about 1 minute. If I make it grab all orders (going back to 2009) it takes 18 minutes. There are a lot of records, so that's fair enough. However, 17 minutes of that time is spent grabbing records the report already has and that will never change, because their STAT55='C' for completed. It's only new orders, or orders at STAT55=' ', that are subject to change.

Is there any way for QlikView not to have to grab these millions of records from scratch every time? The only way I could think of was to run two QlikView reports, with the second report matching the first report against our ODBC source.

Ta

Adam

Not applicable
Author

Hmm... let me see if I've got it...

Only orders with STAT55=' ' can still change, OK?

But an order that isn't completed today can be completed in the future... If that's the case, what you're really asking about is an incremental load.

To do an incremental load, you'll need to know about 5 things (you can learn more about each in the help file):

1 - The STORE command;

2 - The QvdNoOfFields function (or another file function; see the help file);

3 - The Peek function;

4 - Using variables to change your script dynamically;

5 - The IF statement in the script editor (it's different from the IF function used in chart expressions);

I think something like this will work...

// Check whether the file ORDERS.qvd already exists
LET V_VerifyIfFileExists = if(IsNull(QvdNoOfFields('C:\MyDir\ORDERS.qvd')), 'false', 'true');

IF V_VerifyIfFileExists = 'false' THEN

  [ORDERS]:
  Load *; // loads the data returned by the SQL command below
  SQL SELECT * FROM SQL_ORDERS_TABLE where STAT55='C'; // loading all completed orders

  STORE [ORDERS] INTO [C:\MyDir\ORDERS.qvd] (qvd);

ELSE

  [ORDERS]:
  Load * From [C:\MyDir\ORDERS.qvd] (qvd);

  // Retrieving the latest date among the completed orders already stored
  [TMP_MaxOrderDate]:
  Load Max(ORDER_DATE) as MaxOrderDateField RESIDENT ORDERS;
  Let V_MaxOrderDate = num(Peek('MaxOrderDateField', 0, 'TMP_MaxOrderDate')); // reads the value of 'MaxOrderDateField' from row 0 of 'TMP_MaxOrderDate'
  DROP TABLE TMP_MaxOrderDate; // won't be used any more

  // Concatenating with NEW 'completed' orders
  CONCATENATE ([ORDERS]) Load *; // loads the data returned by the SQL command below and concatenates it onto the ORDERS table
  SQL SELECT * FROM SQL_ORDERS_TABLE where STAT55='C' and ORDER_DATE >= '$(V_MaxOrderDate)'; // loading only new 'completed' orders

  // Refreshing the .qvd file with the new data
  STORE [ORDERS] INTO [C:\MyDir\ORDERS.qvd] (qvd);

ENDIF

// Concatenating with 'changeable' orders
CONCATENATE ([ORDERS]) Load *; // loads the data returned by the SQL command below and concatenates it onto the ORDERS table
SQL SELECT * FROM SQL_ORDERS_TABLE where STAT55=' '; // loading only 'changeable' (open) orders

I couldn't test it... but I think it will work (perhaps, haha).

Regards,

Bruno

Not applicable
Author

Blimey, Bruno, that must have taken a lot of typing!

Thank you very much. I'll have a blast through it this afternoon and reference the help file to try and learn what it is I'm actually writing. I'll report back as soon as I've worked through it.

You're right about orders changing status. Basically, as soon as an order is at STAT55='C' I don't want to have to load it back from my ODBC source again and again, because this is where 95% of my data is; cutting that out will speed the report up massively. The key thing for us is being able to report over "live" data as much as possible, so fast data downloads mean we can refresh often. Cognos, our current BI solution, just doesn't offer this flexibility because of the time involved in building cubes.

Kind Regards

Adam