

Qlik AutoML: How are confusion matrix values generated?

KellyHobson
Support


Introduction:

 

For binary classification models, AutoML generates a confusion matrix after running a model experiment. This represents the True Positives (model predicted true and the target value was true), True Negatives (model predicted false and the target value was false), False Positives (model predicted true but the target value was false), and False Negatives (model predicted false but the target value was true).
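The four cells described above can be counted directly from the predictions and the actual target values. The following is a minimal sketch of that counting, not Qlik's implementation; the function name and sample data are illustrative:

```python
# Sketch only (not Qlik AutoML source): counting the four confusion-matrix
# cells for a binary classifier from boolean labels.
def confusion_matrix(y_true, y_pred):
    """Return (tp, tn, fp, fn) counts for boolean labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if p and t)          # True Positives
    tn = sum(1 for t, p in zip(y_true, y_pred) if not p and not t)  # True Negatives
    fp = sum(1 for t, p in zip(y_true, y_pred) if p and not t)      # False Positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if not p and t)      # False Negatives
    return tp, tn, fp, fn

# Illustrative example: six rows of actual vs. predicted values
actual    = [True, True, False, False, True, False]
predicted = [True, False, False, True, True, False]
print(confusion_matrix(actual, predicted))  # (2, 2, 1, 1)
```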

 

[Image: confu_matrix.png — example AutoML confusion matrix]

 

Explanation:

 

The confusion matrix is calculated from the hold-out set reserved during model training.

During the model training phase, the tool automatically splits the data between training and hold-out sets on an 80/20 basis.
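The 80/20 split described above can be sketched as a simple shuffle-and-cut. This is an illustrative helper, not Qlik's code; the function name and seed are assumptions:

```python
import random

# Sketch (not Qlik AutoML source): shuffle the rows, then reserve the last
# 20% as the hold-out set and use the first 80% for training.
def train_holdout_split(rows, holdout_fraction=0.2, seed=42):
    rows = list(rows)
    random.Random(seed).shuffle(rows)           # reproducible shuffle
    cut = int(len(rows) * (1 - holdout_fraction))
    return rows[:cut], rows[cut:]               # (training, hold-out)

train, holdout = train_holdout_split(range(100))
print(len(train), len(holdout))  # 80 20
```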

 

[Image: holdout.png — 80/20 training vs. hold-out split]

 


The training data is split through a k-fold iteration, and the model is trained on multiple samples. The hold-out dataset is withheld until the end of this process; the model is then run against it to generate the model metrics and the confusion matrix. You can think of the hold-out set as showing how the model performs against data it has never seen before.
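The k-fold iteration described above can be sketched as follows. This is the general k-fold pattern, not Qlik's source: each fold serves once as the validation slice while the remaining folds are used for training, and the hold-out rows never enter the loop at all.

```python
# Sketch of generic k-fold splitting (assumed behavior, not Qlik AutoML code).
def k_fold(rows, k=5):
    """Yield (training, validation) splits; each row validates exactly once."""
    size = len(rows) // k
    for i in range(k):
        validation = rows[i * size:(i + 1) * size]
        training = rows[:i * size] + rows[(i + 1) * size:]
        yield training, validation

# The 80 training rows from the 80/20 split; the 20 hold-out rows stay out.
training_rows = list(range(80))
for training, validation in k_fold(training_rows, k=5):
    assert len(training) == 64 and len(validation) == 16
```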

 

Environment

 

The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.

 

Version history
Last update: 2022-09-29 06:29 AM