
Performance degradation when reading from Oracle Database

Hi,

I have a query that fetches data from an SCD Type 2 table.

The query fetches only the required columns and has a WHERE clause of START_DATETIME <= 'date1' AND END_DATETIME > 'date2'.

date1 and date2 may be the same or different values.

The table also has indexes on START_DATETIME and END_DATETIME.

There are 10 CLOB columns out of the 73 columns in the table. (Could this be a problem?)
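For reference, a simplified sketch of the extract in the load script (the DSN, table, key, and variable names below are placeholders, and date1/date2 are shown as script variables purely for illustration):

ODBC CONNECT TO [OracleDSN];              // placeholder connection

BaseData:
SQL SELECT
    KEY_COL,                              -- placeholder business key
    START_DATETIME,
    END_DATETIME
    -- ... only the other required columns, including the CLOBs
FROM SCD2_TABLE                           -- placeholder table name
WHERE START_DATETIME <= TO_DATE('$(vDate1)', 'YYYY-MM-DD HH24:MI:SS')
  AND END_DATETIME   >  TO_DATE('$(vDate2)', 'YYYY-MM-DD HH24:MI:SS');

STORE BaseData INTO BaseData.qvd (qvd);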

Here are the stats for creating the base QVDs (a replica of the DB); record counts are in millions:

Env1: fetching 0.2M records out of 0.5M: 53 min

Env2: fetching 0.4M records out of 1.8M: 1 hr 5 min

Env3: fetching 0.2M records out of 4.5M: 13 hr 20 min

Each environment has its own set of servers (QlikView servers, DB servers, etc.).

The performance of Env3 is causing a major problem, and we expect the record count to grow further in the future. Incremental load has been implemented; the stats above are for the initial load.
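For context, the incremental load follows the usual QVD pattern, roughly like this (simplified; the change-tracking column LAST_MODIFIED_DATETIME, the key KEY_COL, and the variable names are placeholders, not necessarily the actual implementation):

// Fetch only rows changed since the last successful reload.
IncrementalData:
SQL SELECT *
FROM SCD2_TABLE
WHERE LAST_MODIFIED_DATETIME > TO_DATE('$(vLastReloadTime)', 'YYYY-MM-DD HH24:MI:SS');

// Append the unchanged rows from the existing QVD.
Concatenate (IncrementalData)
LOAD *
FROM BaseData.qvd (qvd)
WHERE NOT Exists(KEY_COL);

STORE IncrementalData INTO BaseData.qvd (qvd);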

Could you give any hints about what the problem might be, or what I should check to determine its exact cause?

Thanks in advance.

1 Solution

Accepted Solutions
Author

This issue has been solved. The problem was with the network.

There was a drastic difference in network speed between the environments.

Also, fetching the CLOB columns was taking a considerable amount of time; converting these columns to VARCHAR reduced the reload time.
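In case it helps anyone, one way to do the conversion in the query itself (rather than changing the table definition) is to truncate the CLOBs to VARCHAR2 with DBMS_LOB.SUBSTR; note that this cuts off anything beyond the given length, and the table/column names below are placeholders:

BaseData:
SQL SELECT
    KEY_COL,
    DBMS_LOB.SUBSTR(CLOB_COL1, 4000, 1) AS CLOB_COL1   -- VARCHAR2, max 4000 chars
    -- ... repeat for the other CLOB columns
FROM SCD2_TABLE
WHERE START_DATETIME <= TO_DATE('$(vDate1)', 'YYYY-MM-DD HH24:MI:SS')
  AND END_DATETIME   >  TO_DATE('$(vDate2)', 'YYYY-MM-DD HH24:MI:SS');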

Thanks

