Added actions: Select Excluded, Select All, Select Possible
Added support for 12.20 SR3
Bugfix for export to excel action
Added transfer state (bookmark) support for document chaining
GUI: Template selection now more accurately shows supported QlikView versions
Bugfixes: ensure the minimum number of selections is respected in certain corner cases
Improved performance for high throughput scenarios
Changelog 1.1 -> 1.2
Improved assertions logic to show faulty actions as errors
Change JMeter template scripts to use groovy instead of beanshell
JMeter version 3.0
Java 64-bit JRE 8
Your use of QVScalabilityTools will be subject to the same license agreement between you and QlikTech for QlikView Server. QlikTech does not provide maintenance and support services for QVScalabilityTools; however, please check QlikCommunity for additional information on use of these products.
The supported versions of Java are 8 and 9. The supported versions of JMeter are 3.0 and 4.0; 3.2 is not supported.
So I'm running the scalability tool. I have set up the data collector, but when I generate my script, execute it, create the QVDs, and read them in, it doesn't appear that the data collector is getting the CPU or RAM usage stats. I'm using the included template. Anyone have any suggestions?
So I'm using version 8 here and I just can't get it to work. The tests run, I get the log files, and I can create the QVDs and look at the stats; the problem is that none of the performance stats really load. I don't get CPU or RAM usage stats, and I think the reason is that the following fields don't get filled in: QVS_Servername, QVS_Cores, QVS_Ram, QVS_Ram_Min%, QVS_Ram_Max%, %Key_ServerName, cores, ram, RAM_Min%, Ram_Max%.
Where do these get pulled from? I've looked at the performance logs and I see that stats are being collected, but they don't show up in the qvw.
Any help would be greatly appreciated. I have a zip file of everything, so if you would like to review it, please let me know. Though it is nice to see the response times, what I really need is the CPU and RAM usage stats.
Open the ServerMetaFetcher.qvw and enable macros. Add the IP addresses or machine names of all machines used in the testing to the table. Click the "Fetch Properties" button. Check the details table to ensure that everything is collected. Make sure the WorkingSet limits are set according to the settings in QEMC->System->Setup->Performance for each machine, as they default to min 70 and max 90 and are not read from the server.
Click the "Export ServerInfo" button. This saves an XML file, called ServerInfo.xml, adjacent to ServerMetaFetcher.qvw.
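For orientation, ServerInfo.xml is a plain XML file. The fragment below is a hypothetical sketch of what one server entry might contain; only the ProcessorName element is confirmed from an actual file quoted later in this thread, and every other element name here is a placeholder used purely to illustrate the kind of data collected (host name, cores, RAM, working-set limits):

```xml
<!-- Hypothetical sketch; element names other than ProcessorName are placeholders -->
<ServerInfo>
  <Server>
    <ServerName>MCQKVWWAP001</ServerName>
    <Cores>8</Cores>
    <RamGB>64</RamGB>
    <WorkingSetMinPercent>70</WorkingSetMinPercent>
    <WorkingSetMaxPercent>90</WorkingSetMaxPercent>
    <ProcessorName>Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz</ProcessorName>
  </Server>
</ServerInfo>
```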
I have used that, and I have enabled macros in the Management Console. The issue I have is that I entered the following IPs/server names (localhost, 127.0.0.1, mcqkvwwap001, 184.108.40.206). These all refer to one server, as the web server and QlikView Server are on the same machine, and I'm running my tests on that machine. When I click Fetch Properties, only localhost and 127.0.0.1 get filled in with data, not the server name or external IP address.
Now when you say enable macros, are you talking about the server side or in the app? If in the app, how do I do that?
Macros need to be enabled in the QlikView Desktop running ServerMetaFetcher.qvw. They can be enabled by clicking Allow Any in the popup when opening it the first time, or by going to Tools->Edit Module and changing Current Local Security to Allow System Access.
The metafile can also be created from within the QvScalabilityTools by clicking Tools->Get server info and providing the correct server and output folder in the dialog.
Thanks for this; it fixed the issue with not being able to pull the server information. But with this pulled, I don't see that my ServerInfo XML file is any different than when I just manually added the hostname and external IP of the server. When I use the QVD generator and SC_Results, I'm still seeing that the server settings are blank for everything.
<ProcessorName>Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz</ProcessorName>
For server settings, what should or shouldn't be checked? I unchecked "Allow Session Recovery".
Under Logging, I have checked enable session logging, enable performance logging (1 minute), enable event logging, and enable audit logging; Event Log Verbosity = high and split logs daily.
Under the Authentication tab, it is: Allow Anonymous, Anonymous Account = On Local Computer, Authorization = NTFS. Miscellaneous options checked: allow dynamic data update, allow unsafe macro execution on server, allow extensions, allow macro execution on server, and compress network traffic.
Performance working set is 70% and 90%.
Should I be setting my logs to normal logging or debug logging?
I'm still not sure where the server settings get set from and why when I look at SC_Results all of the server settings are blank.
I see that you ran the tool locally instead of using an external load client as recommended. Running it locally will obviously affect the test results depending on how heavy the test is, as for a heavier test the QlikView engine and JMeter might contend for resources. Running it locally will also not include the response times induced by your IT infrastructure, thus being faster than a real user will perceive it. However, depending on the purpose of the test, it might be just fine.
The problem you are experiencing now is due to the JMeter log files being named baseline_localhost_[1-1-3]_140702120437. Rename the files, replacing localhost with MCQKVWWAP001, and the generator should be able to find the lines connected to the test.
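If there are many result files, the rename can be scripted rather than done by hand. Below is a minimal sketch in Python; the folder path and hostnames in the example call are placeholders for your own environment, and it assumes the host name appears in the file name surrounded by underscores, as in the naming pattern above:

```python
import os

def rename_logs(folder, old_host, new_host):
    """Rename JMeter result files in `folder`, swapping the host part of the
    name, e.g. baseline_localhost_..._140702120437 -> baseline_MCQKVWWAP001_...
    """
    for name in os.listdir(folder):
        if f"_{old_host}_" in name:
            os.rename(
                os.path.join(folder, name),
                os.path.join(folder, name.replace(f"_{old_host}_", f"_{new_host}_")),
            )

# Example call (path is a placeholder):
# rename_logs(r"C:\QVScalabilityTools\Results", "localhost", "MCQKVWWAP001")
```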