
Official Support Articles


Qlik AutoML: How are confusion matrix values generated?

For binary classification models, AutoML generates a confusion matrix after a model experiment runs. The matrix reports True Positives (the model predicted true and the target value was true), True Negatives (the model predicted false and the target value was false), False Positives (the model predicted true but the target value was false), and False Negatives (the model predicted false but the target value was true).
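The four cells above can be counted directly from actual and predicted labels. This is a minimal sketch in plain Python using made-up labels, not Qlik AutoML's internal implementation:

```python
def confusion_counts(y_true, y_pred):
    """Return (TP, TN, FP, FN) for binary labels (1 = true, 0 = false)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)          # predicted true, actually true
    tn = sum(1 for t, p in zip(y_true, y_pred) if not t and not p)  # predicted false, actually false
    fp = sum(1 for t, p in zip(y_true, y_pred) if not t and p)      # predicted true, actually false
    fn = sum(1 for t, p in zip(y_true, y_pred) if t and not p)      # predicted false, actually true
    return tp, tn, fp, fn

# Illustrative data only:
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]
print(confusion_counts(y_true, y_pred))  # (3, 3, 1, 1)
```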






The confusion matrix is calculated from the holdout set.

During the model training phase, the tool automatically splits the data into training and holdout sets on an 80/20 basis.
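An 80/20 split of this kind can be sketched as a shuffle followed by a cut. This is an illustrative example only; the function name, seed, and split mechanics are assumptions, not Qlik AutoML's actual procedure:

```python
import random

def train_holdout_split(rows, holdout_frac=0.2, seed=42):
    """Shuffle rows and split them into training and holdout sets (80/20 by default)."""
    shuffled = rows[:]                    # copy so the caller's list is untouched
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * (1 - holdout_frac))
    return shuffled[:cut], shuffled[cut:]

rows = list(range(100))
train, holdout = train_holdout_split(rows)
print(len(train), len(holdout))  # 80 20
```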




The training data is split through a k-fold iteration, so the model is trained and validated on multiple samples. The holdout dataset is withheld until the end of this process; the trained model is then run against it to generate the model metrics and the confusion matrix. You can think of the holdout set as showing how the model performs on data it has never seen before.
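The k-fold iteration described above rotates which slice of the training data is used for validation, so every row is validated exactly once. This sketch generates the index splits for each fold; it illustrates the general technique, not Qlik AutoML's specific implementation:

```python
def k_fold_indices(n, k):
    """Yield (train_indices, validation_indices) for each of k folds over n samples."""
    # Distribute any remainder across the first n % k folds.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = list(range(start, start + size))                       # this fold's validation slice
        train = list(range(0, start)) + list(range(start + size, n)) # everything else
        yield train, val
        start += size

for train_idx, val_idx in k_fold_indices(10, 5):
    print(len(train_idx), len(val_idx))  # 8 2 for every fold
```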




The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support for the solution above may not be provided by Qlik Support.


Version history
Last update:
‎2022-09-29 06:29 AM