First check if you can see these on the CalendarTime print sheet. If that is fine, then the logs are not connected properly. This is usually because the performance counter logs use a different name for the server than the JMeter log, and it is solved by changing the Testservername input field in the QVD generator to the machine name of the server (p. 3 in "User guide for the analyzer.pdf").
The session file is not critical (or needed) to analyze results, but it can contain useful information. It is a QMC setting and not part of the scalability tools, but I can give some pointers:
* Logging needs to be turned on, and it has to have a valid path to write to.
* It will not write sessions to it right away, but rather after a session times out or when the QVS is restarted.
* If you have successfully accessed the documents on the server manually with a browser and nothing is logged, then please double-check against the QMC documentation that everything is set up correctly.
I cannot even see them on the CalendarTime print sheet. I have checked the data collector properties; it has the same performance counter names as shown in the PDF.
1: Can you please tell me how to restart the QVS?
2: While executing the test I am not accessing the dashboard in a browser. I am just creating the scenario, creating the QVDs and analysing the result.
Please tell me how to get the performance counter values on the dashboard. Moreover, I am also not getting CPU and RAM usage.
On 09-Feb-2015 8:50 PM, "Sebastian Fredenberg" <email@example.com>
The machine name in the performance counter log should be the same as in the JMeter log, or you have to edit it in the QVD generator. You can look in the demotest folder that comes with the scalability tools for an example (the JMeter log is called onlinesales_nehalemex1_[10-90--1]_20111129162749, the performance counter log is called NEHALEMEX1_Processes20111128-000434). The machine name is in bold.
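As a rough illustration of the matching described above, here is a small Python sketch (not part of the tools; the way the machine name is split out of the filenames is my assumption based on the demotest names):

```python
# Demotest filenames from the post above.
jmeter_log = "onlinesales_nehalemex1_[10-90--1]_20111129162749"
perf_log = "NEHALEMEX1_Processes20111128-000434"

# Assumed pattern: the JMeter log has the machine name as its second
# underscore-separated token; the perf counter log starts with it.
jmeter_machine = jmeter_log.split("_")[1]        # "nehalemex1"
perf_machine = perf_log.split("_Processes")[0]   # "NEHALEMEX1"

# The comparison is case-insensitive here; if the names differ, the
# Testservername field in the QVD generator is where you fix it.
print(jmeter_machine.lower() == perf_machine.lower())  # prints: True
```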
1. You restart the QVS from the Task Manager.
2. A prerequisite is that you can access the document manually via a browser; if there is any issue with that, then JMeter will never work. JMeter only simulates users coming in from browsers.
Lastly, I think there is a mix-up here, as RAM and CPU come from the performance counters. If you do not get those in the dashboard, please check through the documentation again and compare with how it looks in the demotest folder, as that is an example of results read in and ready to be analyzed.
As mentioned above, the logs are connected by server name. By default, JMeter will take the same server name as displayed in the browser URL (salebookqa, I assume).
This can differ from the machine name of the server, but in that case you need to edit it in the Testservername input field in the QVD generator.
Also, again, please do not post the same question in several different threads (questions related to performance counters have already appeared in three or more threads).
I will not post the same question in different threads. I have already entered the machine name SE103222 in the Testservername input field. I made a new test and observed some strange behaviour.
Now in the print sheet, Calendar Time (when no test is selected, i.e. for all tests), I can see values plotted for:
Working Set IIS,
and no values plotted for:
CPU QVWS, Private Bytes QVWS, Working Set QVWS.
Why is that?
Also, in the print sheet I cannot see any of the values given above. Please help.
And when I select a test, the values in Sec1 are shown for some test cases and not for others. Why is that?
That means that the log files are not connected properly. Thus, if you select a JMeter test that is not connected to the performance counters, you will deselect the server performance data (unassociated values).
You need to read through the documentation again and look over how you have connected the logs in the QVD generator, as that is where the problem most likely lies.
The reason that you will not see QVWS is that you have IIS installed and not QVWS (according to Sec 1).
Hey, regarding the performance counter log I did the following:
I created a data collector and started it. Then a performance counter log named se103222_processes.... is made. I copied it to the server logs. In the QVD generator I put the server machine name for each JMeter test and recreated the QVDs. I opened SC_Results and reloaded it, but could not see values for Working Set, Private Bytes or CPU usage for some of the tests. What am I missing?
On 10-Feb-2015 6:25 PM, "Sebastian Fredenberg" <firstname.lastname@example.org>
I created a test scenario in the QVScalability tool. I ran the test with 10 users, infinite iterations, a ramp-up period of 5 sec and a duration of 180 sec, and when I analysed the result:
1: Under the Main, Relative Time tab: I did not get RAM per click.
2: Under Main, Calendar Time: I did not get data in Messages, Errors per server, CPU QVS, Private Bytes, QVS Sessions, Sessions Started.
3: Under the Troubleshoot tab: in Assertions, Actions I am getting 100% error for all the tests that I conducted.
4: Under Printsheet, Calendar Time, I did not get values for Working Set and Private Bytes.
When I opened the XML file in JMeter, I saw that all the actions I added are in red. Why is that?
Also, no session file is being made.
I am stuck with a problem. Please help; it is important.
When I created a test scenario and analysed the results, I saw in the Troubleshoot sheet that the Error% for all requests is 100%. That means all requests are failing. When I open the test.xml file in JMeter, run it and look in the results tree, I see that I can access the document, but every action that I put in the test, from Access Point to Clear All, is failing. There is a red triangle for all those actions, which means none of the actions are responding; all are failing. Yet I can access the same document in a browser and do multiple clicks. Please help.
On 10-Feb-2015 12:17 PM, "Sebastian Fredenberg" <email@example.com>
You are back to posting the same question in several places, and you are posting questions that have been answered as well. The original questions asked in this particular thread have been answered, just a few posts up.
You have previously posted that you have not accessed your document manually with a browser. If this is the case, then the instructions provided in the documentation have not been followed completely, which I believe is what causes the issues you see. I will however try to address the various issues here:
* No session log indicates either that the setup is wrong, or simply that no session has been created. If you are able to access the document uploaded to the Access Point from a browser, then a session should be created. If you cannot open the application with a browser, then JMeter will not work either.
* 100% error means that no request is successful. Likely there is something wrong with the server name or document name; otherwise there are usually a few successful requests even when access is denied. Creating the script with the tool requires you to copy the URL from a browser and paste it into the scalability tool GUI; if you do not use a browser, the likelihood of errors increases.
* If you can see RAM without selecting a test and it then disappears when selecting a test, then JMeter and the performance counters are not connected properly. The documentation covers how to handle this (set the machine name in the QVD generator and create the meta file). The timezone also has to be correct in the QVD generator. When I say RAM, that includes the graphs for RAM per click, Private Bytes and Working Set.
* If you do not get IIS but QVWS, then have you really installed IIS? Usually one of these is needed, not both.
So to summarize: first of all, double-check the documentation, and check previous responses in these threads, as many of the answers and helpful suggestions are there. Then, you need to access the document with a browser; this is a prerequisite for creating the scripts, and if this does not work, then JMeter will not work either. Finally, do not post the same question over and over again in several threads.
I checked it; session logs are being created, but a bit later.
Now I have also double-checked the documentation. I would like to tell you the steps I followed:
1: I go to QMC, under the System tab -> Setup -> select the server name -> Documents tab -> remove the check from "Allow Session Recovery".
2: Under the Logging tab -> change the folder location for the logs to my location, change Verbosity to High, change split files to Daily.
3: Apply the changes in QMC.
4: Go to Server Manager -> create a data collector by importing the Process.xml file.
5: Under Roles -> select IIS (which is already running), select the Application Pool and make the required changes.
6: Restart QVS and IIS and start the data collector.
7: Open the QVScalability tool -> give the URL of the document.
8: Select the layout folder and then make the actions (we do not have an Access Point; we directly give the dashboard link).
9: Generate script -> execute script.
10: Click on Server Metafetcher -> put in the server name and fetch the properties; ServerInfo.xml is created. I repeat this step only once because the server name is the same.
11: In the QVD generator I put in the folder name and reload it. The JMeter tests appear after reloading. Now under Testservername I put the server name and click on Create Meta-csv. A meta file (.csv) is created inside <Mainfolder>.
12: I click on SC_Results for analysis, put in my folder name and reload it.
13: Now I cannot see the Working Set and Private Bytes values for QVS and IIS. The RAM per click values are not there. Under Troubleshoot, in the Assertions, Actions tab, I am getting 100% for Error% for all my requests.
When I open Test.xml in JMeter I see that I can access the document, but whatever actions I put in the test scenario all fail. Why is that?
Also, I would like to know what kind of authentication I should use. I used NTLM, but all my requests are failing. In QMC under Authentication, "Allow anonymous" is selected, and the anonymous account is on the local computer.
Also, I am logged in to Windows on the server, but when I access the document I have to give a username and password in the browser. In this case, which authentication works?
Also, I would like to know: if all my requests are failing, then how can it show the exact number of sessions created, the exact number of clicks on every action, and the response time per action? How can these values be shown if all my requests are failing? Please help.
On 17-Feb-2015 1:14 PM, "Sebastian Fredenberg" <firstname.lastname@example.org>
Now it is much easier to help, as we can see which steps you have performed. I believe you have previously stated that you get RAM/CPU when not selecting a test in the analyzer. If that occurs, the log files are not connected properly through the server name, or the timezone has not been changed in the QVD generator. The ServerInfo.xml should also be put in the test folder that contains the folders for JMeterLogs, QVD etc.
If you have gotten those performance counters in the past but not anymore, then check that the data collector is still running. Depending on how the performance counters are set up, you might need to have all services running when the data collector is started.
As for why you get 100%, the only suggestion I can give is to run it through the JMeter GUI and check View Results Tree. Check both the request data and the responses for any clues as to what errors cause the requests to fail. You will still get numbers of sessions and clicks, as well as response times, as this is measured by JMeter. Even if requests fail, it will still try, and it can measure the time to get the response.
As for authentication, you need to set the same authentication in JMeter as you have set up on your server. You also need to make sure you have enough QlikView licenses to simulate your users. The user running the script needs to have access to the document, and if you are shown a login prompt, then I question whether you are running the test with a user that has access via NTLM.
This can be the cause of the 100% error, so make sure you have a correct setup according to the QlikView Server Reference Manual and then reflect that in the script.
Thanks, but I did not understand you completely.
Let me tell you some more things:
1: I have installed JMeter and the QVScalability tool on the server machine itself (because it is 64-bit).
2: When I open a browser and access the document (the one I have to test), I have to give my username and password.
3: In QMC under the Authentication tab, "Allow anonymous" is selected.
4: In the QVScalability tool I have selected NTLM authentication. What do you advise?
5: I generate the script and then execute it without opening JMeter. But when in the analysis I get 100% error for all the requests in Assertions, Actions under the Troubleshoot tab, I open the script in JMeter (with the "open in jmeter" option) and run it there. All the actions that I added, from Access Point to Clear All, come out in red. It seems they failed.
6: In QMC the client is set to the IE client, and JMeter prefers Firefox. I hope this is not an issue.
7: Do I need to make proxy settings in the browser?
8: Also, since you said I have to give a username and password to open a document, can you please tell me how this would cause a 100% error? And what kind of authentication should I use so that the 100% error does not occur?
9: I would also like to know: you said that even though all requests failed, JMeter still tries the actions and shows the number of clicks and response times. Are they accurate (the same as when all requests succeed)? If yes, can you please explain how?
I don't know why all requests are failing. Please help; it is urgent.
JMeter will perform actions even if the end result fails. It basically sends those requests and measures the time to get the response (even if that is an error code). JMeter sends HTTP requests to the QVS, and a request will be seen as failed if the return is, for example, 400 Bad Request or 401 Unauthorized. The time to get that response will still be measured. To see which errors you get, you can check the requests and responses with the View Results Tree component. If you want more in-depth detail about JMeter, then head over to the Apache JMeter website.
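As a minimal sketch of the mechanism described above (plain Python, not JMeter itself): the elapsed time is recorded whether or not the response is an error code, and only the status class decides pass/fail.

```python
import time
import urllib.request
import urllib.error

def is_success(status):
    # Like JMeter's default behaviour: 2xx/3xx pass, 4xx/5xx fail.
    return 200 <= status < 400

def timed_request(url):
    """Send one request and measure elapsed time, the way a JMeter
    sampler does: timing is recorded even for 400/401 responses."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url) as resp:
            status = resp.status
    except urllib.error.HTTPError as err:
        status = err.code  # e.g. 400 Bad Request or 401 Unauthorized
    elapsed_ms = (time.monotonic() - start) * 1000
    return status, elapsed_ms, is_success(status)
```

This is why the analyzer can still report click counts and response times for a test where every request failed: the samples exist, they are just all marked as errors.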
It acts as a browser, but does not run through a browser, so you do not need to configure your browser, and which browser you prefer for accessing the QMC does not matter.
As for the script, select the option which corresponds to your QMC setup, i.e. which authorization/authentication your system is set up to use. If authentication is NTLM, then use NTLM; if you use a header, then select that option.
If access is restricted in any way not handled by the script (a login window could be that), then that can be the reason for the failures. Running out of licenses will of course also lead to errors.
There is also a possibility that your setup is not supported in the tools.
Thanks. In my QMC under the Authentication tab, "Allow anonymous" is selected and NTFS is selected (not DMS). But since sessions are created, I think I am being authenticated using NTLM. The clicks do show up on the web server, but since the server variable is empty and the path contains only "|", I think that is what causes my requests to fail: after clicking on the Access Point, which is hosted on the web server, requests go to the QlikView server (the server variable here), whose value is empty.
One more thing I would like to add: when we generate a script in the QVScalability tool, it sets values for server, path, webserver and document name. In my case, QVS and IIS have the same server name (both run on the same machine). In the generated script, the value for server is empty, the value for path is "|", the value for document name is abc_supervisory/salesreport.qvw, and the webserver name is cd123456. Does this have something to do with the failures of my requests?
On 18-Feb-2015 2:18 PM, "Sebastian Fredenberg" <email@example.com>
Yes, that is the likely cause of the errors. It seems that the full URL has not been pasted into the scalability tools, or there are special characters in the URL that prevent proper parsing of the URL into parameters in the JMeter script.
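As a rough illustration of the kind of parsing the script generator has to do, here is a Python sketch splitting a hypothetical AccessPoint-style URL into the parameters named in the post above (the URL shape and parameter names are illustrative assumptions, not the tool's actual format):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical AccessPoint-style URL; parameter names are illustrative only.
url = ("http://cd123456/QvAJAXZfc/opendoc.htm"
       "?document=abc_supervisory/salesreport.qvw&host=QVS%40se103222")

parts = urlparse(url)
params = parse_qs(parts.query)

webserver = parts.netloc              # "cd123456"
document = params["document"][0]      # "abc_supervisory/salesreport.qvw"
server = params.get("host", [""])[0]  # "QVS@se103222"; if this ends up
                                      # empty in the script, requests fail
```

If the URL is truncated or contains characters the parser does not expect, fields like `server` come out empty, which matches the symptom described in this thread.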
There is an example url in the documentation that looks like:
If that is pasted into the latest uploaded scalability tools (version 0.8) and then opened in JMeter, it looks like this:
I have completed the stress testing with the QVScalability tool. I was not getting the CPU and RAM usage because I was not changing the timezone. Now I can see every graph. This tool really makes the work so much easier. It rocks.
One last thing I would like to know: in QVScalability tool version 0.8, every time I fetch log files a new folder with logs is created, so every time I have to create a copy of SC_Results <Mainfolder>. Is there some method by which all my analyses can be done within a single SC_Results? Otherwise I am not able to compare different analyses.
JMeter runs through Java, and if the JMeter instance requires too much memory (as in this case), the Java process will crash. You might mitigate the issue by allocating more heap memory to the JMeter instances (the default is 3 GB). This is done on the Execution tab, under advanced options, in the tools. The amount of memory needed will depend on the size of the script (the number of actions) and the number of threads running concurrently (users).
A comment on your settings, however, as this is important. 1000 users with a 5-second ramp-up means that you are simulating 200 users accessing the document every second, up to the point where you have 1000 simultaneous users performing selections at the same time. In my experience that is not a realistic real-life scenario, either in terms of ramp-up (chunks of 200 users over a few seconds, then suddenly stopping) or in terms of concurrent users. It is important to understand what 1000 concurrent users means, as it is not the same as 1000 concurrent sessions.
A session lives by default for 30 minutes, which means that you can simulate 1000 active sessions by (for example) running 100 concurrent users with 10 iterations. Once a user finishes its scenario, it starts over with a new session. This way you simulate constant load on the server but need less RAM allocated to JMeter to handle the threads, and it is likely a more realistic scenario. 1000 concurrent users, on the other hand, means that you will in effect have 1000 requests sent at the same time, hammering the server.
Going back to your scenario: what you are simulating is a huge number of users accessing the document in a very short timespan (5 s ramp-up). All of those users will perform the actions specified in your scenario only once and then stop (1 iteration). It does not matter that the duration is infinite, as you are only simulating one run-through of the scenario per user. Depending on the length of the scenario, you are basically simulating a burst of activity over a timespan of seconds to minutes. Is that the objective of the test, or is it to measure the performance when a constant stream of users (realistically chosen in number) accesses a document over a period of time? What I want to get at is that it is very important to think about what a realistic scenario is, and to think twice about what the provided settings will actually mean.
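The arithmetic above can be written out explicitly (plain Python, using the numbers from this thread):

```python
# 1000 users with a 5 second ramp-up: rate of new threads starting.
users, rampup_s = 1000, 5
starts_per_second = users / rampup_s  # 200.0 new threads every second

# Simulating ~1000 sessions with far fewer concurrent threads:
# each finished iteration starts a new session.
concurrent_users, iterations = 100, 10
total_sessions = concurrent_users * iterations  # 1000 sessions over the run

print(starts_per_second, total_sessions)
```

The second calculation is the point of the advice: 100 threads looping 10 times produce the same number of sessions as 1000 one-shot threads, with far less load on the JMeter machine and a steadier load on the server.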
Thanks, but I would like to clear up a doubt about the ramp-up period.
1: If there are 10 users and the ramp-up period is 2 sec, does that mean a new user starts every 2 sec? That is, after 2 sec the 2nd user, after another 2 sec the 3rd user, and so on?
And are the number of actions and the number of clicks different things?
Under the Main, Relative Time tab:
2: In Actions, AvgResponseTime a horizontal line is always there in every test. What does it signify?
3: In the #Request tab, what does "MS, elapsed" mean? What does it signify?
Under the Main, Calendar Time tab:
4: In Errors per server, what kind of errors are they? Because for the same document in the Troubleshoot sheet I am getting 0% error.
5: In JMeterOpenDoc and QVSSessions I am getting different values for the number of sessions. Why is that? At a particular time, the number of sessions in JMeterOpenDoc and in QVSSessions differ. They should be equal, right? Because the threads (sessions) are created by JMeter.
6: In Sessions Started, what does W3WP mean?
7: What do the append logs show?
1. No, the ramp-up is the time to start ALL threads. From the documentation: "Ramp up – Time in seconds needed to start all threads (users). A ramp up value of 30 and Concurrent users value of 10 means that one user (thread) is started every third second." In your case (10 users, 2 sec ramp-up) you are starting 5 threads in the first second, and then 5 more in the second second.
2. It is the average line for the tests in that graph.
3. The number of clicks within the specified millisecond intervals.
4. Those are the number of messages in the QlikView event log that are not marked as "Information"; it counts the number of errors and warnings reported by QlikView. What I think you are looking at in the Troubleshoot tab is the responses your simulation (JMeter) gets.
5. Two likely reasons: a JMeterOpenDoc will not be logged as a QlikView session if it fails. The second reason is that sessions are written to the session log when the session times out (default 30 minutes) or when the QVS is restarted.
6. I don't know which tab you are on, but W3WP is IIS.
7. It's an old chart used to detect common mistakes that are now handled by the tools, so disregard it.
It seems like you are focusing very much on details. Most of the value can be taken from the print sheet graphs (after verifying that the test is relevant and accurate).
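The thread start schedule described in point 1 can be sketched in a few lines of Python (the even spacing is an assumption that matches the documentation quote, not JMeter's exact scheduler):

```python
def start_times(users, rampup_s):
    # One thread starts every rampup_s / users seconds, evenly spaced.
    interval = rampup_s / users
    return [round(i * interval, 3) for i in range(users)]

print(start_times(10, 30)[:4])  # [0.0, 3.0, 6.0, 9.0] -> one every 3 s
print(start_times(10, 2)[:4])   # [0.0, 0.2, 0.4, 0.6] -> 5 threads per second
```

So with 10 users and a 2-second ramp-up, all threads are running after 2 seconds, not after 20.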