Anonymous
Not applicable

Cpu usage is 100%

Hi, my environment is 96 GB of RAM with a 16-core CPU,

and my application is 170 MB on disk, with 32 M records across 60 fields, using around 830 MB of RAM.

The data model is just 2 tables linked by one field, both loaded from stored QVDs.
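For reference, the load is essentially of this shape (the QVD names here are placeholders, not my real ones):

// Both tables come straight from stored QVDs. A plain LOAD * with no
// WHERE clause or transformations stays an "optimized" (fast) QVD load.
Fact:
LOAD * FROM FactTable.qvd (qvd);

Dims:
LOAD * FROM DimTable.qvd (qvd);
// The tables associate automatically on the single field name they share.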

When I just make a selection in a given list box or chart, CPU usage rises to 100% (coming back down to 67% within a fraction of a second), and

in the Performance tab of Task Manager all the cores show significant peaks at 100%.

My question is: is it normal for this to take 100% CPU? The performance of the dashboard is very slow.

What is the solution for this? I have already seen the list of best practices and I'm following those...

12 Replies
Not applicable
Author

Hi,

Probably not enough cores.

Is it a 64-bit (x64) machine?

Anonymous
Not applicable
Author

Thanks for the reply...

Mine is a 64-bit server with 2 processors.

Is 16 cores good? Does having more or fewer cores help?

And the data model has 2 tables: one table with 50 fields and the other with around 9 fields.

rajeshvaswani77
Specialist III

Normally 2% to 4% CPU usage is what I have seen. Are there any other services running that you could shut down? Is it QlikView itself that is consuming the CPU?

Anonymous
Not applicable
Author

It's a server with only QlikView installed; I don't see any other major software running.

When I hit reload, CPU usage jumps to 100%, and when I select any filter it jumps to 100% as well.

What does that indicate? Performance is very slow, and my objects each have a calculation time of about 5306 ms on a given sheet.

Not applicable
Author

Why not post a screenshot of Task Manager's Processes tab?

Sort the list by memory usage, and add the Virtual Memory column as well.

The thing is, you have very high memory but relatively low CPU power.

I think there's a rule of thumb that says 1 core for every 2 GB of RAM, or something like that; by that rule your 96 GB would call for 48 cores, and you only have 16.

Not applicable
Author

Interesting comment, Nick; I would love to see where that came from.

I have 200 GB of RAM in my server and often get the CPUs flat-lining at 100%. My data model is fairly complex and the volumes relatively high (60 M rows, a 200-column-wide fact table, and 32 dimension tables joined to that fact in a star schema).

However, I "only" have 24 cores in that server, and your statement above suggests that I should be upgrading to 100 cores!!

I have to say, global bank or not, the chances of that happening are less promising than me flying to the moon for lunch tomorrow.

Anonymous
Not applicable
Author

I'm sorry to ask this, but how do I add the Virtual Memory column?

Not applicable
Author

Nigel, yes, you are right.

I was about to go looking for the source when someone told me that figure; maybe it was 4 GB per core?

The point is: how does CPU usage reach 100%? For me the memory line is always flat, but the CPU goes up and down,

and my data size is 5 GB with a processing time of 3 hours,

and the CPU never reaches 100%.

I guess I have very good scripting skills, then.

Not applicable
Author

Sri, if you take Nigel's scenario and mine,

the only remaining explanation would be your scripting or expressions.

All the points I made are for reference.

Please don't buy new cores; I don't believe in that either. (Though I do intend to, so I can load some crazy data mart.)
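To make the "scripting or expressions" point concrete, here is a hedged sketch (field names are invented, not from Sri's app). A conditional aggregation like the first expression is re-evaluated row by row on every click; the set-analysis form, or a flag computed once in the script, usually cuts the per-selection CPU cost dramatically:

// Chart expression, slow form: the If() runs for every record on each selection.
Sum( If(Status = 'Open', Amount) )

// Chart expression, faster form: set analysis filters before aggregating.
Sum( {< Status = {'Open'} >} Amount )

// Script-side alternative: pay the cost once at reload instead of on every click.
// (Note: adding a calculated field makes this QVD load non-optimized.)
Fact:
LOAD *,
     If(Status = 'Open', 1, 0) AS IsOpenFlag
FROM Fact.qvd (qvd);

// The chart expression then becomes a cheap multiply-and-sum:
// Sum(Amount * IsOpenFlag)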