This space offers a variety of blogs, all written by Qlik employees, covering both product and non-product topics.
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
This blog was created for professors and students using Qlik within academia.
Hear it from your Community Managers! The Community News blog provides updates about the Qlik Community Platform and other news and important announcements.
The Qlik Digest is your essential monthly low-down of the need-to-know product updates, events, and resources from Qlik.
The Qlik Education blog provides information about the latest updates to our courses and programs from the Qlik Education team.
We understand that when you encounter issues or have questions, you need quick and convenient access to assistance. That's why we're thrilled to introduce our latest beta enhancement: In-Product Support Chat for Qlik Cloud Administrators.
We're taking customer support to the next level by making it more accessible and user-friendly. Whether you're a seasoned pro or a new administrator, getting the help you need is easier than ever.
You'll find our Support Chat right under the Resource Center in your cloud tenant:
Whether it's troubleshooting technical issues, getting guidance on using our features, or simply asking a question, our always-learning chatbot is at your service 24/7. Or you can connect with our support team, available Monday through Friday, for personalized assistance tailored to your specific needs.
We’re excited to embark on this journey with you. Please be sure to share feedback on your chat experience at the end of each conversation so we can continuously make improvements.
Thank you for choosing Qlik; we look forward to chatting with you!
If you have any questions about our new in-product Support, don’t hesitate to reach out.
Sincerely,
Qlik Global Support
Ever found yourself stuck with a messy pile of data that seems more like a labyrinth than a pathway to clean insights? You're not alone. Today, we're diving into the world of data cleaning in Qlik Sense to help you uncover the analytical potential hiding behind your data.
Imagine you're baking a cake. Would you eyeball the measurements of your ingredients? Probably not, unless you fancy a disaster cake. Just like one poorly measured cup of flour can ruin your entire recipe, a small data error can throw off your whole analysis. That's why, before you dive into the fun part—data analysis—you've got to make sure your key ingredient (data) is as clean and precise as possible.
It's not just about tidying up; it's about quality control. Skipped steps or overlooked errors can lead to inaccurate results that could misinform your business decisions.
Let's start with mapping tables. These behave differently from other tables: they are stored in a separate area of memory and are used strictly as mapping tables while the script runs, after which they are automatically dropped.
Let’s take a look at how to do this and the different statements and functions that can be used:
CountryMap:
MAPPING LOAD * INLINE [
Country, NewCountry
U.S.A., US
U.S., US
United States, US
United States of America, US
];
Keep in mind that a mapping table must have exactly two columns, the first containing the comparison values and the second containing the desired mapping values.
The ApplyMap function is used to replace data in a field based on a previously created Mapping Table.
CountryMap:
MAPPING LOAD * INLINE [
Country, NewCountry
U.S.A., US
U.S., US
United States, US
United States of America, US
];
Data:
LOAD
ID,
Name,
ApplyMap('CountryMap', Country) as Country,
Code
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
The first parameter in ApplyMap is the Mapping Table name in quotes. The second parameter is the field containing the data that needs to be mapped.
You can add a third parameter to the ApplyMap function that serves as a default to handle cases when the value doesn’t match one in the Mapping Table.
For instance:
ApplyMap('CountryMap', Country, 'Rest of the world') As Country
after mapping:
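To make ApplyMap's lookup-with-default behavior concrete, here is a minimal Python sketch of the same logic (the `apply_map` function and the dict are illustrative stand-ins, not Qlik APIs): without a third parameter the input value passes through unchanged, while with one, unmatched values fall back to the default.

```python
# Illustrative stand-in for a two-column mapping table.
country_map = {
    "U.S.A.": "US",
    "U.S.": "US",
    "United States": "US",
    "United States of America": "US",
}

def apply_map(mapping, value, default=None):
    # Matched values are replaced; unmatched values pass through
    # unchanged unless a default (the third parameter) is given.
    if value in mapping:
        return mapping[value]
    return value if default is None else default

print(apply_map(country_map, "U.S.A."))                       # US
print(apply_map(country_map, "France"))                       # France
print(apply_map(country_map, "France", "Rest of the world"))  # Rest of the world
```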
The MapSubString function works like ApplyMap, but it replaces matching substrings within a field rather than the whole value, which makes it handy for stripping unwanted characters:

ReplaceMap:
MAPPING LOAD * INLINE [
char, replace
")", ""
"(", ""
"\"", ""
"/", ""
"-", ""
] (delimiter is ',');
TestData:
LOAD
DataField as data,
MapSubString('ReplaceMap', DataField) as ReplacedString
INLINE [
DataField
"(415)555-1234",
"(415)543,4321",
"“510”123-4567",
"/925/999/4567"
] (delimiter is ',');
after cleaning:
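The substring-replacement behavior can be sketched in Python like this (`map_substring` is an illustrative analogue, not a Qlik API): every occurrence of each mapped substring is replaced by its mapped value.

```python
def map_substring(mapping, s):
    # Replace every occurrence of each mapped substring,
    # analogous to MapSubString with a mapping table.
    for old, new in mapping.items():
        s = s.replace(old, new)
    return s

# Strip the same characters as the ReplaceMap table above.
replace_map = {")": "", "(": "", '"': "", "/": "", "-": ""}
print(map_substring(replace_map, "(415)555-1234"))  # 4155551234
print(map_substring(replace_map, "/925/999/4567"))  # 9259994567
```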
The Map ... Using statement applies a mapping to a field in every subsequent LOAD statement until an UNMAP statement is reached:

Map Country Using CountryMap;
Data1:
LOAD
ID,
Name,
Country
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
Data2:
LOAD
ID,
Country as Country2
FROM [lib://DataFiles/Data.xlsx]
(ooxml, embedded labels, table is Sheet1);
UNMAP;
The SubField function splits a field on a delimiter and returns the requested part; here it separates FullName into first and last names:

UserData:
LOAD * INLINE [
UserID, FullName
1, "John,Doe"
2, "Jane,Doe"
3, "Alice,Wonderland"
4, "Bob,Builder"
];
CleanedData:
LOAD
UserID,
SubField(FullName, ',', 1) as FirstName,
SubField(FullName, ',', 2) as LastName
RESIDENT UserData;
Drop Table UserData;
Example 1:
Let's use a combination of the functions above to clean up a more complex field and extract the first name and last name.
UserData:
LOAD * INLINE [
UserID, Object
1, "37642UI101John.Doe"
2, "98322UI101Jane.Doe"
3, "45432UI101Alice.Wonderland"
4, "32642UI101Bob.Builder"
];
CleanedData:
LOAD
UserID,
SubField(Right(Object, Len(Object) - Index(Object, 'UI101') - 4), '.', 1) as FirstName,
SubField(Right(Object, Len(Object) - Index(Object, 'UI101') - 4), '.', 2) as LastName
RESIDENT UserData;
Drop Table UserData;
after cleaning:
Example 2:
Cleaning HTML in a field
Paragraphs:
LOAD * INLINE [
Paragraph_ID, Paragraph
1, "<p>This is a <strong>paragraph</strong>.</p><br><p>This is another <em>paragraph</em>.</p>"
];
// Loop through each paragraph in the Paragraphs table
For vRow = 1 to NoOfRows('Paragraphs')
Let vID = Peek('Paragraph_ID', vRow-1, 'Paragraphs'); // Get the ID of the next record to parse
Let vtext = Peek('Paragraph', vRow-1, 'Paragraphs'); // Get the original paragraph of the next record
// Strip each <...> tag from the paragraph in place
Do While len(TextBetween(vtext, '<', '>')) > 0
vtext = Replace(vtext, '<br>', chr(10)); // Replace line breaks with carriage returns - improves legibility
vtext = Replace(vtext, '<' & TextBetween(vtext, '<', '>') & '>', ''); // Find groups with <> and replace them with ''
Loop;
// Store the cleaned paragraphs into a temporary table
Temp:
Load
$(vID) as Paragraph_ID,
'$(vtext)' as cleanParagraph
AutoGenerate 1;
Next vRow;
// Join the cleaned paragraphs back into the original Paragraphs table
Left Join (Paragraphs)
Load *
Resident Temp;
// Drop the temporary table
Drop Table Temp;
after cleaning:
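The same tag-stripping algorithm as the loop above, sketched in Python for clarity (a simplified analogue, not part of any Qlik API): replace `<br>` with newlines, then repeatedly delete the first `<...>` group until none remain.

```python
def strip_tags(text):
    # Replace line breaks with newlines - improves legibility.
    text = text.replace("<br>", "\n")
    # Repeatedly remove the first <...> group, as the Do While loop does.
    while True:
        start = text.find("<")
        if start == -1:
            break
        end = text.find(">", start)
        if end == -1:
            break
        text = text[:start] + text[end + 1:]
    return text

html = "<p>This is a <strong>paragraph</strong>.</p>"
print(strip_tags(html))  # This is a paragraph.
```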
I hope you found this post helpful!
Attached you can find a QVF that contains the scripts used in the post.
Happy data cleaning!
Hello Qlik Users,
As previously announced, the Blendr.io platform will be sunset and all access to the platform will be discontinued on its Retirement Date, July 11, 2024. Customers with existing active subscriptions can continue renewing their subscriptions prior to the Retirement Date. However, please note that all subscriptions, including renewals, will automatically end on the Retirement Date. All existing Blendr.io platform customers have received notice of the Retirement Date as of January 11, 2023.
Due to the platform’s retirement, no new platform updates, except for patches for security issues, will be made.
However, customers with existing active subscriptions can still reach out to Qlik Support in the event of interruptions in the platform’s service or if an active automation is no longer working. Please note that Qlik Support is not available for new automations built by customers. Further, connector development work will not be provided for new block or connector requests. Please keep in mind that Qlik is not responsible for updating the platform due to changes in any third-party vendor’s API.
We appreciate your long-standing support of Qlik/Blendr.io. For Qlik Cloud customers, you can achieve the same results with our Qlik Application Automation capabilities. Explore more about this feature here.
Please contact your Qlik sales representative, the product team, or Qlik Customer Support if you have any questions.
Given Node-RED's rich ecosystem of add-on modules, your imagination is the limiting factor of what can be done... One example is hybrid Sense environments where on-premise/client-managed Qlik Sense is used to process data from local systems/sources. Ctrl-Q NR can then act as the hub that orchestrates cloud app reloads, starting them when on-prem systems have reloaded. A reload fails? That can be detected and alerted on. Another example is integration of IoT data with Qlik Sense: use Node-RED to collect the data and feed it to Sense, then use Ctrl-Q NR to visually manage the various Qlik Sense resources involved (app reloads etc.).
Easy prototyping and deployment of integrations that would otherwise be difficult and/or time consuming to create.
Qlik Sense admins and developers for both client-managed and cloud Qlik Sense.
Being able to quickly test and prototype ideas is an important capability in a fast-moving IT landscape. Thanks to the low-code nature of Node-RED (on top of which Ctrl-Q NR runs) integrations between Sense and other systems and tools can often be done in minutes. Node-RED runs on most platforms, including Windows, Linux, macOS and Docker.
Access control and governance are at the heart of any enterprise BI software such as Qlik. Often, granular access control is the real deal breaker when deciding which software to choose.
For example, it is critical to ensure that only the Sales Team can see the Revenue sheet, that sales reps can see just their own sales numbers, that sales managers can see their team's numbers, and that region heads can see their region's.
WoWizer DAA empowers organizations to embrace proactive access validation and compliance within their Qlik deployments. Leveraging its cutting-edge capabilities, Wowizer DAA addresses the limitations of the existing system and brings a host of benefits to the table.
Developers, business leaders, and regulatory authorities alike can all find common ground in the value that Wowizer DAA brings to the table. Secure, compliant, and confident: that's the new standard that Wowizer DAA sets for Qlik dashboard auditing.
Business leaders must ensure that development teams comprehend the requirements and deploy them correctly.
In large organizations with multiple hierarchies, there may be ambiguity regarding which rule will take precedence.
Qlik champions face challenges in verifying that what they have implemented is accurate and has been deployed in production.
Data Protection Officers require testing evidence to comply with regulations such as GDPR.
🔑 Explore Wowizer Data Access Auditor and enhance your data security today! Get started with the free download and visit the official website for more information.
Demo
YouTube: Dive into Wowizer DAA
Website: WoWizer.com 🔗
Data Access Auditor FREE: Free Download 🆓
Github Page: https://wowizer.github.io/DAA 📚
Hi everyone,
Want to stay a step ahead of important Qlik support issues? Then sign up for our monthly webinar series where you can get first-hand insights from Qlik experts.
The Techspert Talks session from September looked at Migrating NPrinting to Qlik Cloud Reporting.
But wait, what is it exactly?
Techspert Talks is a free webinar held on a monthly basis, where you can hear directly from Qlik Techsperts on topics that are relevant to Customers and Partners today.
In this session we will cover:
Click on this link to see the presentation
Welcome to August's Qlik Data Integration newsletter. Each month, we cover one endpoint and share our top resources, best practices, release updates and upcoming webinars.
Subscribe to the Qlik Data Integration topic to be notified of future editions!
Index
As of July 31st, 2023, the following endpoints have been retired:
Source Endpoints:
Target Endpoints:
See Retirement for Specific Qlik Replicate Endpoints for details.
Find our latest knowledge base articles for Databricks endpoints.
Qlik Replicate May 2023 patches
Component/Process: Databricks (Cloud Storage)
Description: When reconnecting after a recoverable error while uploading a file, the last file did not get uploaded, resulting in missing data.
Component/Process: Databricks Lakehouse (Delta)
Description: When using merge, if one of the columns in a Unique Index was NULL the changes would not be applied correctly. The issue was resolved using a Feature Flag at task and server level.
Qlik Replicate August 2023 IR
Qlik Replicate November 2023 IR
An evergreen pair of articles helps you read and analyze Qlik Replicate log files:
How to analyze a Qlik Replicate log
List of the error types in Qlik Replicate
| Qlik Release | Qlik Replicate / Enterprise Manager | End of Support Date | Qlik Compose | End of Support Date |
| --- | --- | --- | --- | --- |
| February 2021 | November 2020 SR1 | November 2022 | February 2021 | February 2023 |
| May 2021 | May 2021 | May 2023 | May 2021 | May 2023 |
| August 2021 | May 2021 SR1 | May 2023 | August 2021 | August 2023 |
| November 2021 | November 2021 | November 2023 | November 2021 | November 2023 |
For more information, see Qlik Product Lifecycles.
Google's Universal Analytics will stop processing new hits on July 1st, 2023, making room for Google Analytics 4.
Continue loading your data using Qlik's new GA4 connector, available for Qlik Cloud and Qlik Sense Enterprise on Windows as of today: Google Analytics 4 (Cloud) | Google Analytics 4 (on-premise)
The new Google Analytics 4 connector is also included in the Qlik Web Connector Standalone package: Google Analytics 4 (standalone)
[GA4] Introducing the next generation of Analytics, Google Analytics 4 | support.google
Prepare for the future with Google Analytics 4 | blog.google
We are happy to announce the next version of our client-managed analytics offering, Qlik Sense August 2023. This version is primarily focused on visualization improvements including a variety of new customization and styling options, and enhancements to navigation and design. Users will also appreciate new support of parquet files, providing storage savings and enhanced performance for large data sets.
In this release, you will find the following new capabilities, many of which are already available in Qlik Cloud today:
Recently, I worked with a Qlik Community member to help them understand the Qlik REST Connector with Qlik Sense and QlikView. At first it appeared simple, but he soon realized he needed to understand a bit more about how the data came back (the response), what the pagination settings were (pages of data used to retrieve more rows), and finally how to link (join, associate) other attributes that came back from the results of multiple REST API endpoints / resources. We got it all working and the results were pleasing. Needless to say, we were able to perform text analytics on a barrage of Facebook comments. As I finalized all this in my head, I wanted to share what I had learned in the simplest way possible, so I decided to find a very simple, publicly available RESTful API with which I could demonstrate my findings easily. The video below presents those findings in an educational and entertaining way using the Star Wars API. Yes, that is correct, I said the Star Wars API. As a bonus, stick around to the end of the video to see the Media Box Extension in action.
See this video on YouTube as well. Using the Qlik REST Connector - Pagination and Multiple JSON Schemas - YouTube
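The pagination pattern described above (follow the response's `next` link until it is empty, accumulating each page's `results`) can be sketched as follows; this is a generic illustration of swapi-style paging, not the Qlik REST Connector's actual configuration:

```python
def fetch_all(fetch_page, url):
    # Collect 'results' from every page, following 'next' links until
    # there are no more pages. fetch_page is injected so the sketch
    # stays offline; in practice it would issue an HTTP GET.
    results = []
    while url:
        page = fetch_page(url)
        results.extend(page["results"])
        url = page.get("next")
    return results

# Offline stand-in for two pages of an API response.
pages = {
    "people?page=1": {"results": ["Luke", "Leia"], "next": "people?page=2"},
    "people?page=2": {"results": ["Han"], "next": None},
}
print(fetch_all(pages.get, "people?page=1"))  # ['Luke', 'Leia', 'Han']
```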
Do you know of other simple and fun, publicly available RESTful services? Share them with the Qlik Community in the comments below.
Regards,
Michael Tarallo (@mtarallo) | Twitter
Qlik
Special shout out to: Paul Hallett (@phalt_) | Twitter - for creating an awesome resource http://swapi.co/about that allowed me to easily demonstrate the Qlik Sense REST Connector.
Resources used in this video:
http://swapi.co/api/people/
http://swapi.co/api/species/
Other Resources:
If using Qlik Sense Desktop please copy .qvf file to your C:\Users\<user profile>\Documents\Qlik\Sense\Apps and refresh Qlik Sense Desktop with F5. If using Qlik Sense Enterprise Server please import .qvf into your apps using the QMC - Qlik Management Console.
Disclaimer: Star Wars, the Star Wars logo, all names and pictures of Star Wars characters, vehicles and any other Star Wars related items are registered trademarks and/or copyrights of Lucasfilm Ltd., or their respective trademark and copyright holders.
We are pleased to announce new capacity model pricing for Qlik Analytics. The new pricing model is an extension of the capacity functionality we introduced earlier this year for data integration.
We believe this pricing model aligns with modern customer expectations and will:
Today, we offer three capacity pricing tiers: Standard, Premium, and Enterprise.
You can find additional details on our website Qlik Cloud® Analytics Plans & Pricing
With the Qlik Cloud capacity model, the primary value meter is Data for Analysis or Data Moved, except for Qlik Cloud Analytics Standard where Full Users is the value meter.
See in detail what it means here: Subscription value meters
Additionally, we understand how important it is for Qlik Cloud administrators to monitor their tenants' data consumption. Therefore, we are pleased to introduce:
Additional resources:
Thanks for choosing Qlik!
Qlik Global Support
A common situation when loading data into a Qlik document is that the data model contains several dates. For instance, in order data you often have one order date, one required date and one shipped date.
This means that one single order can have multiple dates; in my example one OrderDate, one RequiredDate and several ShippedDates - if the order is split into several shipments:
So, how would you link a master calendar to this?
Well, the question is incorrectly posed. You should not use one single master calendar for this. You should use several. You should create three master calendars.
The reason is that the different dates are indeed different attributes, and you don’t want to treat them as the same date. By creating several master calendars, you will enable your users to make advanced selections like “orders placed in April but delivered in June”. See more on Why You sometimes should Load a Master Table several times.
Your data model will then look like this:
But several different master calendars will not solve all problems. You cannot, for instance, plot ordered amount and shipped amount in the same graph using a common time axis. For this you need a date that can represent all three dates – you need a Canonical Date. This is how you create it:
First you must find a table with a grain fine enough; a table where each record only has one value of each date type associated. In my example this would be the OrderLines table, since a specific order line uniquely defines all three dates. Compare this with the Orders table, where a specific order uniquely defines OrderDate and RequiredDate, but still can have several values in ShippedDate. The Orders table does not have a grain fine enough.
This table should link to a new table – a Date bridge – that lists all possible dates for each key value, i.e. a specific OrderLineID has three different canonical dates associated with it. Finally, you create a master calendar for the canonical date field.
You may need to use ApplyMap() to create this table, e.g. using the following script:
DateBridge:
Load
OrderLineID,
Applymap('OrderID2OrderDate',OrderID,Null()) as CanonicalDate,
'Order' as DateType
Resident OrderLines;
Load
OrderLineID,
Applymap('OrderID2RequiredDate',OrderID,Null()) as CanonicalDate,
'Required' as DateType
Resident OrderLines;
Load
OrderLineID,
ShippedDate as CanonicalDate,
'Shipped' as DateType
Resident OrderLines;
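For intuition, the same bridge construction, one row per order line and date type, can be sketched in Python (the data structures and lookup dicts are illustrative stand-ins, not part of the script above):

```python
def build_date_bridge(order_lines, order_dates, required_dates):
    # Emit one bridge row per (order line, date type), mirroring the
    # three concatenated loads in the script above.
    bridge = []
    for line in order_lines:
        order_id = line["OrderID"]
        for date_type, date in (
            ("Order", order_dates.get(order_id)),        # like the OrderID2OrderDate map
            ("Required", required_dates.get(order_id)),  # like the OrderID2RequiredDate map
            ("Shipped", line["ShippedDate"]),            # already on the order line
        ):
            bridge.append({"OrderLineID": line["OrderLineID"],
                           "CanonicalDate": date,
                           "DateType": date_type})
    return bridge

order_lines = [{"OrderLineID": 1, "OrderID": 10, "ShippedDate": "1996-07-16"}]
bridge = build_date_bridge(order_lines, {10: "1996-07-04"}, {10: "1996-08-01"})
print(len(bridge))  # 3
```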
If you now want to make a chart comparing ordered and shipped amounts, all you need to do is to create it using a canonical calendar field as dimension, and two expressions that contain Set Analysis expressions:
Sum( {$<DateType={'Order'}>} Amount )
Sum( {$<DateType={'Shipped'}>} Amount )
The canonical calendar fields are excellent to use as dimensions in charts, but are somewhat confusing when used for selections. For this, the fields from the standard calendars are often better.
Summary:
A good alternative description of the same problem can be found here. Thank you, Rob, for inspiration and good discussions.