New features are always being added to Qlik Sense. Today I will provide an update on three topics I have blogged about this year. The topics are:
NL Insights
Straight table
Filter pane
NL Insights
The NL Insights object, which used to be found in the Dashboard bundle, is now a standard chart object. It now “fully supports the Qlik Cloud UI in multiple languages, accessibility requirements, and reporting / exporting.” The functionality of the NL Insights object has also been updated so that it is easier to add insights for the visualizations in an app: developers can simply select a chart they would like interpreted. Once an NL Insights object is added to a sheet, the developer can click on the Add chart button to see a list of objects that are on the current sheet. Natural language insights will be shown for the selected visualization. This is easier for the developer and removes the need to manually select dimensions and measures.
Filter Pane
At its core, the filter pane provides a list box that the user can use to make selections. By default, when a user makes a selection in a filter pane, the selection toolbar pops up prompting the user to confirm the selection, as seen in the image below.
It is now possible to turn off the selection toolbar and immediately confirm the selection when it is made. This can be done in the properties panel by unchecking the Show selection toolbar checkbox. This is useful if the filter pane is displayed as a grid rather than a column and is being used as a button bar.
Note that when the selection toolbar is off, users can still access the select all, select alternative and select excluded options via the sheet toolbar.
Straight Table
The straight table can now be downloaded as an image or multi-page PDF up to 50 pages. Simply right-click on the straight table, select Download and click on Image or PDF.
There are always new features being added to Qlik Cloud. Use Qlik Help to stay up to date or subscribe to the news feed.
Thanks,
Jennell
The ability to make data-driven decisions in real time is a cornerstone for many businesses. Qlik AutoML's real-time prediction API lets you seamlessly integrate predictions on the fly in your web applications.
In this guide, we will walk through the process of leveraging Qlik AutoML's real-time prediction capabilities using Node.js.
Before using the API, we first need to create an ML experiment in Qlik Cloud and deploy it.
Creating the AutoML Experiment and deploying it
There are lots of help articles around this topic, but for our simple case, we will be using a Customer Churn dataset to train our model (You will find the file attached at the end of this post).
Start by creating a new ML experiment from your Qlik Cloud hub.
After uploading the training file, select the “churned” column as the target for our predictions. Make sure to de-select the “AdditionalFeatureSpend” and “DaysSinceLastService” columns if you want to follow along with this example. The features (columns) selected in this step should later match the schema of the input dataset we want real-time predictions on.
Once we run the experiment to train the model, the best-performing algorithm is automatically selected. You can now configure a different version or simply press Deploy to create a new ML deployment; let’s do the latter.
In the Deployment screen, you can run predictions on new apply datasets right in the hub, or, as in our case, use the Real-time prediction API to programmatically make predictions from our Node.js app.
Creating the Node.js app to make real-time predictions
Head to the Real-time predictions pane as shown above; this is where we will grab the API URL.
To interact with the real-time prediction API, you'll require an API key from Qlik Cloud. Users with the 'Developer' role have the necessary privileges to generate this key. You can learn more about generating one here.
Setting up the Environment
Begin by initializing a Node.js environment and installing the necessary packages. Run:
npm init -y
then
npm install axios dotenv
(The fs module is built into Node.js, so it does not need to be installed separately.)
API Configuration
Create a .env file and define the Real-time prediction API URL and the API key:
API_KEY=eyJhb....
API_URL=https://....
Next, create an index.js file and include the following:
Data prep:
We read the CSV file using fs and format the data to align with the real-time prediction API's requirements:
const axios = require('axios');
const fs = require('fs');
require('dotenv').config();
const api_url = process.env.API_URL;
const api_key = process.env.API_KEY;
// Read the apply dataset (simple CSV parsing; assumes values contain no quoted commas)
const rawData = fs.readFileSync('./customer-churn-apply-subset.csv', 'utf8');
// Skip the header row, drop empty lines, and parse each remaining row
const rows = rawData.split('\n').slice(1).filter(line => line.trim() !== '').map(row => {
const [
AccountID, Territory, Country, DeviceType, Promotion, HasRenewed, PlanType,
BaseFee, NumberOfPenalties, CurrentPeriodUsage, PriorPeriodUsage, ServiceRating,
ServiceTickets, StartMonth, StartWeek, CustomerTenure, Churned
] = row.split(',');
// Build the record; the Churned column is the target and is not included in the payload
return {
"AccountID": AccountID,
"Territory": Territory,
"Country": Country,
"DeviceType": DeviceType,
"Promotion": Promotion,
"HasRenewed": HasRenewed,
"PlanType": PlanType,
"BaseFee": parseFloat(BaseFee),
"NumberOfPenalties": parseInt(NumberOfPenalties, 10),
"CurrentPeriodUsage": parseFloat(CurrentPeriodUsage),
"PriorPeriodUsage": parseFloat(PriorPeriodUsage),
"ServiceRating": parseFloat(ServiceRating),
"ServiceTickets": parseInt(ServiceTickets, 10),
"StartMonth": StartMonth,
"StartWeek": StartWeek,
"CustomerTenure": parseFloat(CustomerTenure),
};
});
// Order the values as arrays to match the schema sent with the request
const formattedRows = rows.map(record => [
record.AccountID,
record.Territory,
record.Country,
record.DeviceType,
record.Promotion,
record.HasRenewed,
record.PlanType,
record.BaseFee,
record.NumberOfPenalties,
record.CurrentPeriodUsage,
record.PriorPeriodUsage,
record.ServiceRating,
record.ServiceTickets,
record.StartMonth,
record.StartWeek,
record.CustomerTenure
]);
Crafting the API Request:
Here, we define the headers and body content for the POST request to the API. Notice that the schema matches the features we used to train our model.
const json_data = {
"rows": formattedRows,
"schema": [
{"name": "AccountID"},
{"name": "Territory"},
{"name": "Country"},
{"name": "DeviceType"},
{"name": "Promotion"},
{"name": "HasRenewed"},
{"name": "PlanType"},
{"name": "BaseFee"},
{"name": "NumberOfPenalties"},
{"name": "CurrentPeriodUsage"},
{"name": "PriorPeriodUsage"},
{"name": "ServiceRating"},
{"name": "ServiceTickets"},
{"name": "StartMonth"},
{"name": "StartWeek"},
{"name": "CustomerTenure"},
]
};
const headers = {
'Authorization': `Bearer ${api_key}`,
'Content-type': 'application/json',
'Accept': 'text/json',
};
Sending the Request & Handling Response
We use axios to send the data and receive the real-time prediction results (you can use any HTTP client for this):
axios.post(
api_url,
json_data,
{
headers: headers,
})
.then(response => {
console.log(response.data);
})
.catch(error => {
console.error("Error:", error);
});
Now you can run the Node app:
node index.js
The results we get are the customer churn predictions for our apply dataset: the second column represents whether the customer is predicted to churn or not, the third column is the probability of Churned_no, and the fourth column is the probability of Churned_yes.
We can compare these results to the ones we get on the hub if we were to create predictions there:
With Qlik AutoML's real-time prediction API at your fingertips, you’re now ready to integrate real-time predictions into your web apps. Whether it’s to enrich your visualizations or build interactive what-if scenarios, you can easily make impactful real-time decisions.
API docs: https://qlik.dev/apis/rest/automl-real-time-predictions
You can find the full code and dataset files used below.
Insight Advisor, our intelligent AI assistant, supports a variety of advanced insight generation and automation experiences, including search-driven insights, conversational analytics, and analysis types – our unique framework for choosing a type of advanced analysis and generating visualizations, NLG, and even smart sheets. You’ll get a look at key driver analysis, which uncovers the factors driving a selected metric. And you’ll get a deeper dive into how you can customize insight generation and natural language processing through our business logic layer.
See how AI-enhanced Insight Advisor Analysis Types: Smart Sheets can automatically generate an interactive performance management dashboard to keep track of your goals in just a few clicks. Yes, it is that easy!
There have been many new capabilities that give developers ways to customize and style an app. In this blog, I will review how the sheet header and toolbar can be toggled on and off and the benefits of each, as well as things to consider. The sheet header and the toolbar both appear at the top of an app. The sheet header, outlined below in yellow, includes the name of the sheet, an optional logo or image, and previous and next sheet navigation arrows.
The toolbar is the row above the sheet header. It includes buttons and links to Notes, Insight Advisor, selections tool, bookmarks, sheets and edit sheet.
The toggle for the sheet header and toolbar can be found in the app options section of an app. Open app options by clicking on the arrow next to the app name at the top center of the app. From there, click on the App options icon on the right.
Once the app options are open, you will find the toggles for Show toolbar and Show sheet header.
One of the main benefits of removing the sheet header and toolbar is to gain more space on the sheet. The space used by the sheet header and toolbar becomes available to developers for additional filter panes and/or visualizations. Another benefit is that developers can add custom capabilities to replace the Qlik Sense defaults. For example, a developer may want to create their own navigation buttons and have more control over the options that are available to the user. If the sheet(s) are being used to create a PowerPoint presentation, removing the sheet header and toolbar also makes the presentation look more polished.
Now let’s discuss some things to consider when removing the sheet header. If the sheet header is removed, alternative sheet navigation should be provided for the user. It is possible to use the keyboard to navigate the sheets, but many people do not know that, so custom navigation should be created by the developer using buttons or links. In the image below, buttons are used.
In the image below, buttons are used again but the highlighted button indicates the sheet the user is on. So, in this example, the developer has replaced the sheet navigation and the sheet title that was included in the removed sheet header.
A sheet title can also be added to a sheet using a Text & image object. The custom navigation can be designed to match a theme or company brand, which gives the developer a lot of flexibility and can give a company’s apps a consistent look and feel.
When the toolbar is toggled off, features are hidden but they are not removed from the app entirely. This is great, but not all users may be aware of alternative ways to access the features on the toolbar, so it is important to keep this in mind. For example, users can still create or view notes for a visualization by right-clicking on a chart, selecting the ellipsis (…) and then selecting Notes. Another example is that users can still access bookmarks or the sheets in an app via the App Overview. Users can still ask questions via Insight Advisor, so no functionality is lost with the removal of the toolbar. Another thing to consider is that while selections can still be made via filter panes and visualizations, without the selection bar, users may not be aware that selections have been made. This is why the developer needs to make sure there are filter panes or some other way for users to know what has been selected. When it comes to selections, buttons can also be used to perform actions such as clearing selections and making selections in a field.
The overall goal is not to make things harder for the user, so knowing the possible issues and designing for them is smart. While there are benefits to toggling off the sheet header and/or toolbar, developers must consider how this may impact their users and how their users will use the app. The user experience can be just as good with the sheet header and toolbar toggled off if the developer plans well for an intuitive user experience.
Thanks,
Jennell
Ever found yourself stuck with a messy pile of data that seems more like a labyrinth than a pathway to clean insights? You're not alone. Today, we're diving into the world of data cleaning in Qlik Sense to help you uncover the analytical potential hiding behind your data.
The Importance of Data Cleaning:
Imagine you're baking a cake. Would you eyeball the measurements of your ingredients? Probably not, unless you fancy a disaster cake. Just like one poorly measured cup of flour can ruin your entire recipe, a small data error can throw off your whole analysis. That's why, before you dive into the fun part—data analysis—you've got to make sure your key ingredient (data) is as clean and precise as possible.
Why Data Cleaning is More than Just a Chore:
It's not just about tidying up; it's about quality control. Skipped steps or overlooked errors can lead to inaccurate results that could misinform your business decisions.
Data Accuracy: The accuracy of your analytics depends heavily on your data's quality. Data cleaning helps to weed out errors and inconsistencies, ensuring your insights are both trustworthy and actionable. Tools like mapping tables or functions like SubField can be invaluable in this stage.
Data Consistency: Inconsistent data formats or naming conventions can be a real roadblock. Qlik Sense offers features like the SubField function and mapping tables to help you standardize data for consistent reporting and visualization.
Data Integration: When you're integrating data from various sources, alignment is crucial. Qlik Sense provides numerous functions that help in aligning these disparate datasets into a cohesive, unified form.
Enhanced Visualization and Performance: Clean data doesn't just make your visualizations more meaningful; it also enhances the performance of your Qlik applications. Expect faster data retrieval and more efficient analysis when your data is in good shape.
Data Cleaning techniques in Qlik Sense:
Duplicates removal: Duplicate records can distort your analysis and reporting. Qlik offers built-in options such as the Keep prefix when loading tables or the DISTINCT keyword in your script to load only unique rows.
Missing values: You can address missing values by removing records or filling in gaps based on specific criteria. Functions like IsNull and NullCount, along with the NullAsValue statement, come in handy.
Data formatting: Using the numerous string functions available in Qlik Sense, you can standardize data values to a consistent format. For example, the Upper, Lower, Date, and Num functions can be used to unify text or dates. (A short load script sketch combining these first three techniques follows this list.)
Data manipulation: Sometimes the data you import into Qlik Sense doesn’t exactly fit your needs. Qlik offers ways to reshape your data accordingly. For instance, inconsistent field values can often occur when pulling data from multiple tables, and this inconsistency can disrupt the connections between data sets. An efficient solution to this is to use mapping tables.
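Here is that sketch. The RawOrders table and its field names are hypothetical, and this is just one possible way to combine DISTINCT, null handling, and the formatting functions; adapt it to your own data:
// Hypothetical source with duplicate rows, a missing value, and mixed date formats
RawOrders:
LOAD * INLINE [
OrderID, Customer, OrderDate, Amount
1001, alice, 2023-01-05, 120.5
1001, alice, 2023-01-05, 120.5
1002, BOB, 05/01/2023,
1003, Carol, 2023-02-10, 89
];
CleanOrders:
LOAD DISTINCT // drop exact duplicate rows
OrderID,
Upper(Customer) as Customer, // standardize text case
Date(Alt(Date#(OrderDate, 'YYYY-MM-DD'), Date#(OrderDate, 'DD/MM/YYYY'))) as OrderDate, // unify the two date formats
If(Len(Trim(Amount)) = 0 or IsNull(Amount), 0, Num(Amount)) as Amount // fill the missing amount with 0
RESIDENT RawOrders;
Drop Table RawOrders;
This should leave CleanOrders with three unique rows, upper-cased customer names, a single date format, and no missing amounts.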
Mapping Tables:
These tables behave differently from other tables in that they are stored in a separate area of memory and are used strictly as mapping tables during script execution; they are then automatically dropped.
Let’s take a look at how to do this and the different statements and functions that can be used:
MAPPING prefix
This is used to create a mapping table. For instance:
CountryMap:
MAPPING LOAD * INLINE [
Country, NewCountry
U.S.A., US
U.S., US
United States, US
United States of America, US
];
Keep in mind that a mapping table must have two columns, the first containing the comparison values and the second containing the desired mapping values.
ApplyMap()
The ApplyMap function is used to replace data in a field based on a previously created Mapping Table.
CountryMap:
MAPPING LOAD * INLINE [
Country, NewCountry
U.S.A., US
U.S., US
United States, US
United States of America, US
];
Data:
LOAD
ID,
Name,
ApplyMap('CountryMap', Country) as Country,
Code
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
The first parameter in ApplyMap is the Mapping Table name in quotes. The second parameter is the field containing the data that needs to be mapped. You can add a third parameter to the ApplyMap function that serves as a default to handle cases where the value doesn’t match one in the Mapping Table. For instance:
ApplyMap('CountryMap', Country, 'Rest of the world') as Country
after mapping:
MapSubstring()
The MapSubstring function is used to map parts of a field; it can be used as an alternative to the Replace() or PurgeChar() functions. For instance, let’s clean up these phone number values by removing unwanted characters:
ReplaceMap:
MAPPING LOAD * INLINE [
char, replace
")", ""
"(", ""
"\"", ""
"/", ""
"-", ""
] (delimiter is ',');
TestData:
LOAD
DataField as data,
MapSubString('ReplaceMap', DataField) as ReplacedString
INLINE [
DataField
"(415)555-1234",
"(415)543,4321",
"“510”123-4567",
"/925/999/4567"
] (delimiter is ',');
after cleaning:
MAP … USING
The Map…Using statement works differently than the ApplyMap() function: ApplyMap does the mapping every time the field name is encountered, whereas Map…Using does the mapping when the value is stored under the field name in the internal table. For instance, in the following load script, the mapping will be applied to the Country field in Data1, but it will not be applied to the Country2 field in the Data2 table. That’s because the Map…Using statement is only applied to the field named Country, and in Data2 the field is stored as Country2 in the internal table.
Map Country Using CountryMap;
Data1:
LOAD
ID,
Name,
Country
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
Data2:
LOAD
ID,
Country as Country2
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
UNMAP;
Useful functions for data cleaning
SubField()
Used to extract substrings from a string field that consists of two or more parts separated by a delimiter. The arguments it takes are a text (the original string), a delimiter (the character within the input text that divides the string into parts), and a field_no that selects which part to return (for example, 1 for the first substring and 2 for the second):
SubField(text, delimiter, field_no)
For instance:
UserData:
LOAD * INLINE [
UserID, FullName
1, "John,Doe"
2, "Jane,Doe"
3, "Alice,Wonderland"
4, "Bob,Builder"
];
CleanedData:
LOAD
UserID,
SubField(FullName, ',', 1) as FirstName,
SubField(FullName, ',', 2) as LastName
RESIDENT UserData;
Drop Table UserData;
Len()
Returns the length of the input string.
Left()
Returns a string of the first (leftmost) characters of the input string, where the number of characters is determined by the second parameter:
Left(text, count)
Right()
Similar to Left, it returns a string of the last (rightmost) characters of the input string. The second parameter determines the number of characters to be returned:
Right(text, count)
Index()
The Index function searches a string and returns the starting position of the nth occurrence of a provided substring. For instance:
Index('qwerty', 'ty') will return 5
Index('qwertywy', 'w', 2) will return the position of the second occurrence of 'w', i.e. 7
Example 1: Using a combination of the functions above to clean up a field. Let’s take a more complex field and try to extract the first name and last name.
UserData:
LOAD * INLINE [
UserID, Object
1, "37642UI101John.Doe"
2, "98322UI101Jane.Doe"
3, "45432UI101Alice.Wonderland"
4, "32642UI101Bob.Builder"
];
CleanedData:
LOAD
UserID,
SubField(Right(Object, Len(Object) - Index(Object, 'UI101') - 4), '.', 1) as FirstName,
SubField(Right(Object, Len(Object) - Index(Object, 'UI101') - 4), '.', 2) as LastName
RESIDENT UserData;
Drop Table UserData;
after cleaning:
Example 2: Cleaning HTML in a field
Paragraphs:
LOAD * INLINE [
Paragraph_ID, Paragraph
1, "<p>This is a <strong>paragraph</strong>.</p><br><p>This is another <em>paragraph</em>.</p>"
];
// Loop through each paragraph in the Paragraphs table
For vRow = 1 to NoOfRows('Paragraphs')
Let vID = Peek('Paragraph_ID', vRow-1, 'Paragraphs'); // Get the ID of the next record to parse
Let vtext = Peek('Paragraph', vRow-1, 'Paragraphs'); // Get the original paragraph of the next record
// Strip the HTML tags from the current paragraph in place
Do While len(TextBetween(vtext, '<', '>')) > 0
vtext = Replace(vtext, '<br>', chr(10)); // Replace <br> line breaks with newlines - improves legibility
vtext = Replace(vtext, '<' & TextBetween(vtext, '<', '>') & '>', ''); // Find groups within <> and replace them with ''
Loop;
// Store the cleaned paragraphs into a temporary table
Temp:
Load
$(vID) as Paragraph_ID,
'$(vtext)' as cleanParagraph
AutoGenerate 1;
Next vRow;
// Join the cleaned paragraphs back into the original Paragraphs table
Left Join (Paragraphs)
Load *
Resident Temp;
// Drop the temporary table
Drop Table Temp;
after cleaning:
I hope you found this post helpful! Attached you can find a file containing the scripts used in the post.
Happy data cleaning!
Conditional show or hide is available in line and bar charts, giving the user the ability to toggle dimensions or measures on or off in a single chart. This allows developers to customize line and bar charts and save space by using one chart to show various metrics and dimensions. Let’s look at a simple way of using this feature to show or hide lines in a line chart. In the Overall Equipment Efficiency demo found on the Demo Site, there is a line chart accompanied by buttons that are used to toggle the lines on and off in the line chart.
This is done by using variables. When each button is clicked, the respective variable is toggled from 0 to 1 or 1 to 0 depending on its current value. See the value expression in the image below.
In the measure expression in the line chart, this variable is checked to determine if the expression should be evaluated and displayed or if the measure should be set to null.
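The expressions in the demo are shown in the images, but as a rough sketch, the pattern typically looks something like this, where vShowOEE is the variable toggled by the button and Avg(OEE) is only a stand-in for the actual OEE measure:
// Value expression in the button's "Set variable value" action - flips vShowOEE between 1 and 0
=If(vShowOEE = 1, 0, 1)
// Measure expression in the line chart - evaluated only when the toggle is on
If(vShowOEE = 1, Avg(OEE), Null())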
This is a perfectly good way to toggle the lines, but with the ability to use conditional show and hide in line and bar charts, this process can be simplified. First, in the measure expression, we no longer need to use an if statement, which can help reduce calculation time. We can simply use our normal expression and the “Show measure if” setting, with the respective variable, to evaluate whether a line should be shown in the visualization.
The “Show measure if” and “Show dimension if” settings evaluate the expression and will show the line if the expression evaluates to true. In my example, vShowOEE will be either 1 or 0. If it is 1, the line will be displayed. If it is 0, then it will not be displayed. We can continue to use the buttons to toggle the respective variable (from 1 to 0 and vice versa) for each line.
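With the conditional show setting, the same sketch simplifies to two separate properties (again using the hypothetical Avg(OEE) stand-in):
// Measure expression - no if statement needed
Avg(OEE)
// "Show measure if" condition - the line is displayed only when vShowOEE is 1
vShowOEE = 1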
My example is basic, but more complex expressions can be used as well. For example, you may want to show or hide lines based on a selection or a calculated value, or you may want to use some business logic to determine which dimension or measure should be displayed. The expression can be as simple or complex as needed, as long as it returns a true or false value. Keep in mind that this show setting is optional and can be left blank. When no expression is entered, the line (or bar) is displayed.
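As an illustration of a selection-based condition, a “Show measure if” expression like the following sketch (using a hypothetical Region field) would display the measure only after the user selects at least one value in that field:
// Show the measure only when one or more Region values are selected
GetSelectedCount(Region) > 0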
There are a few limitations of this new feature to be aware of: 1) Custom tooltips are disabled when using a conditional dimension, 2) Time series forecasting is not available when using conditional dimensions or measures. While the “Show measure if” and the “Show dimension if” can both be used in the same chart, it is recommended that you use only one at a time. Check out Qlik Help to learn more and test this new feature out in your next line or bar chart.
Thanks,
Jennell