
Analytics

Forums for Qlik Analytic solutions. Ask questions, join discussions, find solutions, and access documentation and resources.

Data Integration & Quality

Forums for Qlik Data Integration solutions. Ask questions, join discussions, find solutions, and access documentation and resources.

Explore Qlik Gallery

Qlik Gallery is meant to encourage Qlikkies everywhere to share their progress, from a first Qlik app to a favorite Qlik app and everything in between.

Support

Chat with us, search Knowledge, open a Qlik or Talend Case, read the latest Updates Blog, find Release Notes, and learn about our Programs.

Events & Webinars

Learn about upcoming Qlik-related events, webinars, and local meetups.

Groups

Join a Group that is right for you and get more out of your collaborations. Some groups are closed. Closed Groups require approval to view and participate.

About Qlik Community

Get started on Qlik Community, find How-To documents, and join general non-product related discussions.

Blogs

This space offers a variety of blogs, both product and non-product related, all written by Qlik employees.

Qlik Resources

Direct links to other resources within the Qlik ecosystem. We suggest you bookmark this page.

Qlik Academic Program

Qlik gives qualified university students, educators, and researchers free Qlik software and resources to prepare students for the data-driven workplace.

Community Sitemap

Here you will find a list of all the Qlik Community forums.

Recent Blog Posts

  • Product Innovation

    Introducing ‘Custom Groups’ to Qlik Cloud Analytics!

    Managing users and groups just got easier. With our new Custom Groups capability, you now have more flexibility and control to handle user and group relationships directly in Qlik Cloud, without needing to rely on an Identity Provider (IdP).

    Video: Introducing ‘Custom Groups’ to Qlik Cloud Analytics 

    What’s New?

    • Manage directly in Qlik Cloud

    No more back and forth with external systems. Now, you can manage users and their group memberships directly in Qlik Cloud. Everything is in one place for easy access control.

    • Create smoother group-based provisioning

    Assign users to groups in just a few clicks. Control both administrative access and feature permissions without hassle.

    • Made for your organization

    Custom Groups adapts to your unique structure, making access management easier and more aligned with the way your organization works.

    Use cases:

    • Public Sector flexibility

    Manage group memberships within Qlik Cloud and simplify integrations with third-party apps, without depending on IdP-provided groups.

    • Enterprise autonomy

    Gain full control over user and group assignments, without being tied to Active Directory or other IdP systems. Adapt faster to changing needs.

    Key Benefits:

    • More control over user roles and permissions
    • Less dependence on external systems
    • Better fit for your organization's needs

     

    Custom Groups makes managing access in Qlik Cloud simple and efficient, no matter the size or type of your organization. It saves you time, reduces complexity, and gives you full control over how users and groups are managed.

  • Product Innovation

    Connector Factory – January and February 2025 releases

    Qlik Cloud Analytics

    New!   File Connector

    The File Connector for the Data Gateway provides a key capability to bridge on-premises file data to Qlik Cloud Analytics.  This new connector can help on-premises analytic customers transition to cloud-based analytics as it enables them to easily access and leverage existing on-premises file data, especially QVDs, in Qlik Cloud Analytics.  With familiar file access capabilities, the File Connector can also serve as a more robust replacement to the Qlik Data Transfer tool.

    Customers can use the File Connector to access network drives and file systems via the Gateway server and can preview a file using read-only access to ensure data security.  The File Connector can then load firewalled data files, of any currently supported file type, directly into Qlik Cloud.  

    The File Connector also utilizes predefined connection definitions for quick setup and supports wildcards when selecting files and folders.

    Learn more here:  Qlik Help:  File Connector | SaaS in 60 

     

    New versions of Direct Access gateway

    The Qlik Data Gateway - Direct Access allows Qlik Sense SaaS applications to securely access behind-the-firewall data over a strictly outbound, encrypted, and mutually authenticated connection.

    We recently released Direct Access gateway 1.7.0 and 1.7.1. Version 1.7.0 introduced the File Connector mentioned above, and 1.7.1 includes the integration of a REST Connector via the gateway. It has the exact same capabilities as the REST Connector within Qlik Cloud, but it also provides access to sources based on REST APIs residing on-premises (behind a firewall). We recommend that you use this REST Connector instead of the Qlik Data Transfer tool.

     

    New connectors for Qlik Answers

    Qlik Answers knowledge bases now support Google Drive and OneDrive connections as data sources. You can find more information about  creating knowledge bases here.

      

     

    Qlik Talend

    More capabilities in the Snowflake target connector

    The Snowflake target connector for data replication and data pipelines now supports configuration of advanced (additional) ODBC and JDBC connection properties. This allows users to have fine-grained control over connection definitions beyond standard parameters, including adding properties such as Role, Secondary Role, and more.

    You can find more information about these additional connection properties here.

     

    Qlik Application Automation

    New Connectors

    Qlik Answers - This Qlik-native connector enables the creation of data sources in knowledge bases using existing data connections. It also allows users to interact with assistants by asking questions related to the data source and receive answers based on the existing data. This blog discusses how to get started.

     

    Updated Connectors

    • Added media-files blocks to the Qlik Cloud Services & Qlik Platform Operations connectors.
    • Added a Copy Data File block to the Qlik Cloud Services connector.
    • Added input parameter to name the image generated in the Qlik chart image block.
    • Updated Slack connector to redirect URL to point to Qlik.
    • New and deprecated file-related blocks in the Slack Connector. You can read more about the changes here.
    • Added CheckDataSource field to the Sendgrid connector.
  • Support Updates

    Qlik Sense Enterprise for Windows - New Security Patches Available Now

    Edited December 5th: identified upgrades leading to complications with extensions
    Edited December 6th: added workaround for extension complication
    Edited December 10th: added CVEs (CVE-2024-55579 and CVE-2024-55580)
    Edited December 12th, noon CET: added new patch versions and visualization and extension fix details; previous patches were removed from the download site

    Hello Qlik Users,

    New patches have been made available and have replaced the original six releases. They include the original security fixes (CVE-2024-55579 and CVE-2024-55580) as well as QB-30633 to resolve the extension and visualization defect.

    If you continue to experience issues with extensions or visualizations, see QB-30633: Visualizations and Extensions not loading after applying patch.

    Security issues in Qlik Sense Enterprise for Windows have been identified, and patches have been made available. Details can be found in Security Bulletin High Severity Security fixes for Qlik Sense Enterprise for Windows (CVE-2024-55579 and CVE-2024-55580).

    Today, we have released six service releases across the latest versions of Qlik Sense to patch the reported issue. All versions of Qlik Sense Enterprise for Windows prior to and including these releases are impacted:

    • May 2024 Patch 9
    • February 2024 Patch 13
    • November 2023 Patch 15
    • August 2023 Patch 15
    • May 2023 Patch 17
    • February 2023 Patch 14

     

    No workarounds can be provided. Customers should upgrade Qlik Sense Enterprise for Windows to a version containing fixes for these issues. The following releases contain the fix (November 2024 IR was released on the 26th of November):

    • November 2024 Initial Release
    • May 2024 Patch 10 or 11 (both valid)
    • February 2024 Patch 14 or 15 (both valid)
    • November 2023 Patch 16 or 17 (both valid)
    • August 2023 Patch 16 or 17 (both valid)
    • May 2023 Patch 18 or 19 (both valid)
    • February 2023 Patch 15 or 16 (both valid)
    This issue only impacts Qlik Sense Enterprise for Windows. Other Qlik products including Qlik Cloud and QlikView are NOT impacted.

    All Qlik software can be downloaded from our official Qlik Download page (customer login required). Follow best practices when upgrading Qlik Sense.

    The information in this post and Security Bulletin High Severity Security fixes for Qlik Sense Enterprise for Windows (CVE-2024-55579 and CVE-2024-55580) are disclosed in accordance with our published Security and Vulnerability Policy.

     

    The Security Notice label is used to notify customers about security patches and upgrades that require a customer’s action. Please subscribe to the ‘Security Notice’ label to be notified of future updates. 

    Thank you for choosing Qlik,
    Qlik Global Support

  • Design

    Recipe for a Pareto Analysis – Revisited

    This type of question is common in all types of business intelligence. I say “type of question” since it appears in many different forms: Sometimes it concerns products, but it can just as well concern any dimension, e.g. customer, supplier, sales person, etc. Further, here the question was about turnover, but it can just as well be e.g. number of support cases, or number of defect deliveries, etc.

     

    [Image: bar chart with the Pareto classes shown as colors on the bars]

     

    It is called Pareto analysis or ABC analysis and I have already written a blog post on this topic. However, in the previous post I only explained how to create a measure which showed the Pareto class. I never showed how to create a dimension based on a Pareto classification – simply because it wasn’t possible.

     

    But now it is.

     

    But first things first. The logic for a Pareto analysis is that you first sort the products according to their sales numbers, then accumulate the numbers, and finally calculate the accumulated measure as a percentage of the total. The products contributing to the first 80% are your best, your “A” products. The next 10% are your “B” products, and the last 10% are your “C” products. In the above graph, these classes are shown as colors on the bars.

     

    The previous post shows how this can be done in a chart measure using the Above() function. However, if you use the same logic, but instead inside a sorted Aggr() function, you can achieve the same thing without relying on the chart sort order. The sorted Aggr() function is a fairly recent innovation, and you can read more about it here.

     

    The sorting is needed to calculate the proper accumulated percentages, which will give you the Pareto classes. So if you want to classify your products, the new expression to use is

     

    =Aggr(
        If(Rangesum(Above(Sum({1} Sales)/Sum({1} total Sales),1,RowNo()))<0.8, 'A',
            If(Rangesum(Above(Sum({1} Sales)/Sum({1} total Sales),1,RowNo()))<0.9, 'B',
                'C')),
        (Product,(=Sum({1} Sales),Desc))
        )

     

    The first parameter of the Aggr() – the nested If()-functions – is in principle the same as the measure in the previous post. Look there for an explanation.

     

    The second parameter of the Aggr(), the inner dimension, contains the magic of the sorted Aggr():

     

        (Product,(=Sum({1} Sales),Desc))

     

    This structured parameter specifies that the field Product should be used as dimension, and its values should be sorted descending according to Sum({1} Sales). Note the equals sign. This is necessary if you want to sort by expression.

     

    So the Products inside the Aggr() will be sorted descending, and for each Product the accumulated relative sales in percent will be calculated, which in turn is used to determine the Pareto classes.

     

    The set analysis {1} is necessary if you want the classification to be independent of the made selection. Without it, the classification will change every time the selection changes. But perhaps a better alternative is to use {$<Product= >}. Then a selection in Product (or in the Pareto class itself) will not affect the classification, but all other selections will.
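
    For reference, here is a sketch of the same classification expression with {$<Product=>} substituted for {1} throughout:

    =Aggr(
        If(Rangesum(Above(Sum({$<Product=>} Sales)/Sum({$<Product=>} total Sales),1,RowNo()))<0.8, 'A',
            If(Rangesum(Above(Sum({$<Product=>} Sales)/Sum({$<Product=>} total Sales),1,RowNo()))<0.9, 'B',
                'C')),
        (Product,(=Sum({$<Product=>} Sales),Desc))
        )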

     

    The expression can be used either as dimension in a chart, or in a list box. Below I have used the Pareto class as first dimension in a pivot table.

     

    [Image: pivot table with the Pareto class as the first dimension]

     

    If you use this expression in a list box, you can directly select the Pareto class you want to look at.

     

    [Image: list box showing the Pareto classes]

     

    The other measures in the pivot table are the exclusive and inclusive accumulated relative sales, respectively. I.e. the lower and upper bounds of the product sales share:

     

    Exclusive accumulated relative sales (lower bound):

     

    =Min(Aggr(
        Rangesum(Above(Sum({1} Sales)/Sum({1} total Sales),1,RowNo())),
        (Product,(=Sum({1} Sales),Desc))
      ))

     

    Inclusive accumulated relative sales (upper bound):

     

    =Max(Aggr(
        Rangesum(Above(Sum({1} Sales)/Sum({1} total Sales),0,RowNo())),
        (Product,(=Sum({1} Sales),Desc))
      ))

     

    Good luck in creating your Pareto dimension!

     

    HIC

     

    Further reading related to this topic:

    The sortable Aggr function is finally here!

    Recipe for a Pareto Analysis

    Recipe for an ABC Analysis

  • Qlik Education

    Get Ready. A New Qlik Learning Experience is Coming!

    We are thrilled to announce that the new Qlik Learning will launch on February 17, 2025.

    What is coming? 

    • A new Qlik Learning experience, consolidating the current Qlik Continuous Classroom and Talend Academy into one integrated learning platform.    
    • An unlimited Qlik Learning subscription designed to energize your learning experience, accelerate your success, and help you grow your skills and expertise throughout your career.   

    What can you expect with the new Qlik Learning experience:  

    • Gain hands-on Qlik Analytics and Data Integration experience in one spot with a single sign-on. 
    • Discover easy-to-follow learning journeys that are designed for you.   
    • Attend live webinars delivered on a regular basis on a variety of advanced topics.   
    • Access an extensive library of Qlik Learning Shorts. These short videos show you exactly how and why to perform common tasks, so you get it right the first time, every time.  
    • Use Qlik Certification resources and earn digital badges to help further your career and set new records for continuous learning.   

    What do we recommend to get prepared? 

    While Qlik is excited about this transition, there are actions we recommend: 

    There will be downtime while we prepare the new Qlik Learning experience for you. The downtime window starts at 8:00 AM ET on February 14 and ends at 7:00 PM ET on February 16. During this time, access to the current platform will be unavailable.

    Qlik Continuous Classroom users: 

    If you are in the middle of completing a course, we recommend you complete it ahead of the new Qlik Learning launch, so your completion data is transferred, and progress is not lost. 

    Any achievement or qualification badges for the 2019, 2020, and 2021 Business Analyst or Data Architect will not be migrated into the new Qlik Learning, so we recommend downloading and sharing these using your Badgr backpack; also see the Sharing Badges on Social Sites document.

    Check out the Qlik Learning FAQ we’ve prepared for you. 

    Talend Academy users: 

    You will log into the new Qlik Learning with your Qlik account. Don’t have an account? Sign up for a Qlik Account ahead of the launch using the same email address you use on Talend Academy. 

    Check out the Qlik Learning FAQ we’ve prepared for you. 

    Reach out to Qlik Learning at education@qlik.com if you have any questions. We greatly appreciate your patience as we work to enrich your learning experience.  

    Stay tuned for exciting learning updates! 

    *Important note: While we are confident in the February 17, 2025, launch date, please note there is always a possibility of adjustments. We will keep you informed promptly should any changes occur.

  • Qlik Education

    The New Qlik Learning Is Here!

    We are thrilled to announce that the new Qlik Learning is now live and ready for you! It is a single, integrated learning platform designed to enhance your learning experience and help you get the most out of Qlik.

    An Unlimited Qlik Learning subscription is designed to energize your learning experience, accelerate your success, and help you grow your skills and expertise throughout your career.

    What can you expect with the new Qlik Learning experience?

    • Gain hands-on analytics and data integration experience in one spot with a single sign-on.
    • Discover easy-to-follow learning journeys that are designed for you.
    • Earn badges to help further your career and set new records for continuous learning.
    • Attend live webinars delivered by our expert instructors on a variety of advanced topics.
    • Access an extensive library of Qlik Learning Shorts.
    • Pursue the ultimate recognition of your expertise: Qlik Certification! Access the resources and practice exams to help you prepare.

    How do you get started?

    To get started, simply log in to Qlik Learning with your Qlik account (Don't have an account? Sign up) and complete the short Getting Started course. This will unlock the full range of opportunities available to you. Additionally, after finishing the course, you'll earn a digital badge that you can showcase within your network!

    Check out the Qlik Learning FAQs we’ve prepared for you, and reach out to Qlik Learning at education@qlik.com if you have any additional questions.

    We can’t wait to hear about your experiences and what you love most about the new Qlik Learning!

  • Support Updates

    Qlik Application Automation and Slack: Breaking changes March 11, 2025

    Starting March 11th, 2025, Slack will enforce changes in their APIs affecting file uploads. To accommodate these breaking changes, we have introduced new blocks in the Slack connector for Qlik Application Automation.

    What blocks are affected?

    • Upload File To Channel (new version)
    • Send Text Based file (new version)
    • Send Binary File (deprecated)

    What exactly is changing for Qlik Application Automation?

    The Send Binary File block will be deprecated. Instead, use the Upload File to Channel block to upload binary files. If you still want to send a base64 encoded string, use the Send Text Based File block and configure the encoding parameter to base64.

    The Upload File To Channel block and Send Text Based File block need to be updated to a new version. To perform this update, replace existing blocks with new blocks by dragging the blocks from the block library.

    What will I need to do to mitigate this?

    Any automation using affected blocks needs to be updated. 

    See Breaking changes for file support in the Slack connector: new blocks introduced for steps and details.

     

    Thank you for choosing Qlik,
    Qlik Support

     

  • Qlik Academic Program

    India update: Data and AI related roles will dominate in 2025

    The Indian IT hiring landscape is at a pivotal juncture as it transitions from a year of decline towards a more hopeful future. The focus on specialised skills, particularly in AI and data science, combined with geographical shifts towards Tier 2 cities, indicates a transformation within the sector. While the IT hiring landscape in India in 2024 was marked by delayed onboarding and a decline in overall hiring activity, the outlook for 2025 appears promising, with expectations of recovery and growth fuelled by improvements in economic conditions and technological advancements.

     
    If you are a student or an educator looking to get skilled in data analytics, leverage the resources of the Qlik Academic Program and get training, software, qualifications, and certifications completely free!
  • Product Innovation

    Qlik Sense November 2024 (Client-Managed) now available!

    Visualizations & Dashboards

     

    Navigation Enhancements

    In July 2024 we introduced some major navigation enhancements across the platform, including both new features and improvements to existing functionality. Combined, these components now allow for a more intuitive and fluid experience for everyone. The areas positively impacted by this release are:

    • Navigation Menu
    • Sheet Navigation
    • Sheet Grouping
    • UI Settings


     

    Our teams have been hard at work making robust accessible assets and resources to cover this topic in depth. For more information, whether high-level or the nitty-gritty, please check out the following:

    Pivot Table Improvements

    This is one of those times that we condone messing with a classic such as the Pivot Table, especially when you improve it with features just as classic, you know? Check out the new additions below!

    • Add an image to a cell within the pivot table via URL
    • Copy a cell value
    • Export to image and PDF  
    • Monitor and snapshot your Pivot Tables
    • Ability to Subscribe!

    Cyclic Dimensions Improvements (based on customer feedback!)

    • Ability to now set an active field within the cyclic dimension
    • Newly implemented expression-based labels for both drill-down dimensions and cyclic dimensions!

    Straight Table - Enhancements

    • Text styling by expression: set the dimension or measure to any combination of bold, italic, underline, and strikethrough using a second expression with the tags <b>, <i>, <u> and <s> (see the sketch after this list).
    • Modifiers: turn your measure into an accumulation, a moving average, a difference, or a relative number with a single drop-down.
    • Image in cell via URL
    • Cell font styling
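
    As a minimal sketch of the tag-based text styling (the Margin measure is hypothetical), a second expression such as the following would render a cell in bold strikethrough whenever the margin is negative:

    // Hypothetical styling expression: bold + strikethrough when the margin is negative
    =If(Sum(Margin) < 0, '<b><s>', '')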


     

    Improvements to Selection Bar

    • Custom themes can now be utilized to style the selection bar
    • Updates to how labels work with selection:
      • If the user has provided a label to a master dimension, then that label will be used rather than the field name in the underlying data model. This will improve usability and make it easier to create multilingual applications.


     

    Combo Chart Updates

    • You now have the option to display labels on stacked measure segments
    • This new improvement was a special request from YOU! We are happy to announce we have now implemented adding labels to markers and a setting to toggle the grid.


     

    Tab Container Changes + Bundled Charts

    • Tab Container: Please note that we have made changes to our container visualization within the viz bundle and you may now find it under the nomenclature: "Tab Container". The old container has moved over to the dashboard bundle and will shortly be deprecated.
    • Bundled Charts: All remaining dashboard and visualization bundle charts receive general styling!

     

    Data Prep

     

    Improved UX for Script Editing

    • The script editing experience has been improved in the Data load editor and now offers the same functionality as the Script editor. The editor now includes a data preview feature, allowing users to get better insight when writing script to load data.
    • Visual wizards have been added to easily write Store statements or include QVS files. The editor also adds usability improvements such as resizable panels and the possibility to preview the content of the included QVS files.
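
    As a quick reminder of what those wizards generate, here is a minimal sketch of a Store statement and a QVS include directive (the lib:// paths and file names are hypothetical):

    // Store a resident table to a QVD in a folder data connection
    Store MyTable into [lib://DataFiles/MyTable.qvd] (qvd);

    // Include a reusable script file; Must_Include fails the reload if the file is missing
    $(Must_Include=lib://DataFiles/common_functions.qvs);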

    New Functionality for Search & Replace

    • Users can now search and replace text within the expression editor in Qlik Sense apps, just as they can already do in the script editor and data load editor. This includes a "replace all" option, allowing for quick bulk edits within a single complex expression.
    • The usability of the expression editor has also been improved by adding a confirmation dialog when closing with unsaved changes.

    New Ability for Autocomplete Hints

    • Users can now enable or disable autocomplete hints when writing Qlik script in the script editor, data load editor, and expression editor.

     

    *Important Notice* 

    1) Attention Android Mobile Users

    If you are using an Android mobile device to access Qlik Sense through the mobile app, please do not upgrade to the November 2024 release just yet. The Android mobile client requires additional updates that weren’t ready in time for this release.

    Important Clarification -> Android users can still access Qlik Sense via a mobile web browser without any issues. This limitation only affects the Qlik Sense mobile app on Android.

    This update does not impact:

    • Users accessing Qlik Sense on iOS mobile devices.
    • Users accessing Qlik Sense on laptops, desktops, or other non-mobile platforms.
    • Android users using a web browser to connect to Qlik Sense.


    Only customers using Android mobile devices via the Qlik Sense mobile app are advised to delay upgrading until we release a patch to address this.

    We apologize for any inconvenience this may cause and appreciate your understanding. Our team is working diligently to complete the necessary updates, and we will notify you as soon as the patch is available.

     

    2) Add-on Upgrade Requirements

    View Support Updates for details on add-ons that must be upgraded if you upgrade to Qlik Sense Enterprise on Windows November 2024.

    Thank you for your continued support! For questions or assistance, please reach out to our support team.

     

  • Design

    DIY your Qlik Answers Experience with the new KB and Assistants APIs

    Qlik Answers transforms unstructured data into clear, AI-powered insights. Today, I'll show you how to integrate Qlik Answers directly into your web app using the newly released Knowledgebases API and Assistants API.

    In this blog, we'll build a custom Football chat assistant from scratch powered by Qlik Answers.

    We’ll leverage the Assistants API to power real-time Q&A while the knowledge base is already set up in Qlik Sense.

    For those of you who prefer a ready-made solution, you can quickly embed the native Qlik Answers UI using qlik-embed:

     

    <qlik-embed
      ui="ai/assistant"
      assistant-id="<assistant-id>"
    ></qlik-embed>

     

    You can explore the ai/assistant parameters (and other UIs available in qlik-embed) on qlik.dev, or take a look at some of my previous blog posts here and here.

    For full documentation on the Knowledgebases API and Assistants API, visit qlik.dev/apis/rest/assistants/ and qlik.dev/apis/rest/knowledgebases/.

    Let’s dive in and see how you can take control of your Qlik Answers UI experience!

    What Are Qlik Answers Assistants and Knowledgebases?

    Before we start building our DIY solution, here’s a quick refresher:

    • Knowledgebases: Collections of individual data sources (like HTML, DOCX, TXT, PDFs) that power your Qlik Answers. (In our case, we built the KB in Qlik Sense!)

    • Assistants: The chat interface that interacts with users using retrieval-augmented generation (RAG). With generative AI in the mix, Qlik Answers delivers reliable, linked answers that help drive decision-making.

    DIY the Qlik Answers Experience

    Step 1: Get your data ready

    Since we already created our knowledge base directly in Qlik Sense, we skip the Knowledgebases API. If you’d like to build one from scratch, check out the knowledgebases API documentation.


    Step 2: Configure your assistant

    With your knowledge base set, you create your assistant using the Assistants API. This is where the magic happens: you can manage conversation starters, customize follow-ups, and more. Visit the assistants API docs on qlik.dev to learn more.

    Step 3: Build Your Custom UI

    Now, let’s look at our custom chat UI code. We built a simple football-themed chat interface that lets users ask questions related to the NFL. The assistant’s answers stream seamlessly into the interface.


    HTML:

     

    <!doctype html>
    <html lang="en">
      <head>
        <meta charset="UTF-8" />
        <meta name="viewport" content="width=device-width, initial-scale=1.0" />
        <title>Football Assistant</title>
        <link rel="stylesheet" href="styles.css" />
      </head>
      <body>
        <div class="chat-container">
          <div class="chat-header">
            <h4>Let's talk Football</h4>
            <span class="header-span">You ask, Qlik answers.</span>
          </div>
    
          <div class="chat-body" id="chat-body">
            <div class="message assistant">
              <div class="bubble">
                <p>Hey there, champ! Ask me anything.</p>
              </div>
            </div>
          </div>
          <div class="chat-footer">
            <input
              type="text"
              id="chat-input"
              placeholder="Type your Football related question..."
            />
            <button id="send-btn">Send</button>
          </div>
        </div>
        <script src="scripts.js"></script>
      </body>
    </html>

     

    Frontend JS:

     

    document.addEventListener("DOMContentLoaded", () => {
      const chatBody = document.getElementById("chat-body");
      const chatInput = document.getElementById("chat-input");
      const sendButton = document.getElementById("send-btn");
    
      // Append a user message immediately
      function appendUserMessage(message) {
        const messageDiv = document.createElement("div");
        messageDiv.classList.add("message", "user");
        const bubbleDiv = document.createElement("div");
        bubbleDiv.classList.add("bubble");
        bubbleDiv.innerHTML = `<p>${message}</p>`;
        messageDiv.appendChild(bubbleDiv);
        chatBody.appendChild(messageDiv);
        chatBody.scrollTop = chatBody.scrollHeight;
      }
    
      // Create an assistant bubble that we update with streaming text
      function createAssistantBubble() {
        const messageDiv = document.createElement("div");
        messageDiv.classList.add("message", "assistant");
        const bubbleDiv = document.createElement("div");
        bubbleDiv.classList.add("bubble");
        bubbleDiv.innerHTML = "<p></p>";
        messageDiv.appendChild(bubbleDiv);
        chatBody.appendChild(messageDiv);
        chatBody.scrollTop = chatBody.scrollHeight;
        return bubbleDiv.querySelector("p");
      }
    
      // Send the question to the backend and stream the answer
      function sendQuestion() {
        const question = chatInput.value.trim();
        if (!question) return;
    
        // Append the user's message
        appendUserMessage(question);
        chatInput.value = "";
    
        // Create an assistant bubble for the answer
        const assistantTextElement = createAssistantBubble();
    
        // Open a connection to stream the answer
        const eventSource = new EventSource(
          `/stream-answers?question=${encodeURIComponent(question)}`
        );
    
        eventSource.onmessage = function (event) {
          if (event.data === "[DONE]") {
            eventSource.close();
          } else {
            assistantTextElement.innerHTML += event.data;
            chatBody.scrollTop = chatBody.scrollHeight;
          }
        };
    
        eventSource.onerror = function (event) {
          console.error("EventSource error:", event);
          eventSource.close();
          assistantTextElement.innerHTML += " [Error receiving stream]";
        };
      }
    
      sendButton.addEventListener("click", sendQuestion);
      chatInput.addEventListener("keydown", (event) => {
        if (event.key === "Enter") {
          event.preventDefault();
          sendQuestion();
        }
      });
    });

     

    Backend node.js script:

     

    import express from "express";
    import fetch from "node-fetch";
    import path from "path";
    import { fileURLToPath } from "url";
    
    // Setup __dirname for ES modules
    const __filename = fileURLToPath(import.meta.url);
    const __dirname = path.dirname(__filename);
    
    // Define port and initialize Express app
    const PORT = process.env.PORT || 3000;
    const app = express();
    app.use(express.static("public"));
    app.use(express.json());
    
    // Serve the frontend
    app.get("/", (req, res) => {
      res.sendFile(path.join(__dirname, "public", "index.html"));
    });
    
    // Endpoint to stream Qlik Answers output
    app.get("/stream-answers", async (req, res) => {
      const question = req.query.question;
      if (!question) {
        res.status(400).send("No question provided");
        return;
      }
    
      // Set headers for streaming response
      res.writeHead(200, {
        "Content-Type": "text/event-stream",
        "Cache-Control": "no-cache",
        Connection: "keep-alive",
      });
    
      const assistantId = "b82ae7a9-9911-4830-a4f3-f433e88496d2";
      const baseUrl = "https://sense-demo.us.qlikcloud.com/api/v1/assistants/";
      const bearerToken = process.env["apiKey"];
    
      try {
        // Create a new conversation thread
        const createThreadUrl = `${baseUrl}${assistantId}/threads`;
        const threadResponse = await fetch(createThreadUrl, {
          method: "POST",
          headers: {
            "Content-Type": "application/json",
            Authorization: `Bearer ${bearerToken}`,
          },
          body: JSON.stringify({
            name: `Conversation for question: ${question}`,
          }),
        });
    
        if (!threadResponse.ok) {
          const errorData = await threadResponse.text();
          res.write(`data: ${JSON.stringify({ error: errorData })}\n\n`);
          res.end();
          return;
        }
    
        const threadData = await threadResponse.json();
        const threadId = threadData.id;
    
        // Invoke the Qlik Answers streaming endpoint
        const streamUrl = `${baseUrl}${assistantId}/threads/${threadId}/actions/stream`;
        const invokeResponse = await fetch(streamUrl, {
          method: "POST",
          headers: {
            "Content-Type": "application/json",
            Authorization: `Bearer ${bearerToken}`,
          },
          body: JSON.stringify({
            input: {
              prompt: question,
              promptType: "thread",
              includeText: true,
            },
          }),
        });
    
        if (!invokeResponse.ok) {
          const errorData = await invokeResponse.text();
          res.write(`data: ${JSON.stringify({ error: errorData })}\n\n`);
          res.end();
          return;
        }
    
        // Process and stream the response text
        const decoder = new TextDecoder();
        for await (const chunk of invokeResponse.body) {
          let textChunk = decoder.decode(chunk);
          let parts = textChunk.split(/(?<=\})(?=\{)/);
          for (const part of parts) {
            let trimmedPart = part.trim();
            if (!trimmedPart) continue;
            try {
              const parsed = JSON.parse(trimmedPart);
              if (parsed.output && parsed.output.trim() !== "") {
                res.write(`data: ${parsed.output}\n\n`);
              }
            } catch (e) {
              if (trimmedPart && !trimmedPart.startsWith('{"sources"')) {
                res.write(`data: ${trimmedPart}\n\n`);
              }
            }
          }
        }
        res.write("data: [DONE]\n\n");
        res.end();
      } catch (error) {
        res.write(`data: ${JSON.stringify({ error: error.message })}\n\n`);
        res.end();
      }
    });
    
    // Start the backend server
    app.listen(PORT, () => {
      console.log(`Backend running on port ${PORT}`);
    });

     

    Breaking It Down

    Okay, that was a lot of code! Let’s break it down into bite-sized pieces so you can see exactly how our custom Qlik Answers chat interface works.

    1. The HTML

    Our index.html creates a custom chat UI. It sets up:

    • A chat body where messages appear (initially with a friendly greeting from the assistant).
    • A chat footer with an input field and a send button for users to type their questions.

    2. The Frontend JavaScript (scripts.js)

    This script handles the user interaction:

    • Appending messages: When you type a question and hit send (or press Enter), your message is added to the chat window.

    • Creating chat bubbles: It creates separate message bubbles for you (the user) and the assistant.

    • Streaming the answer: It opens a connection to our backend so that as soon as the assistant’s response is ready, it streams into the assistant’s bubble. This gives you a live, real-time feel without any manual “typing” effect.

    3. The Node.js Backend (index.js)

    Our backend does the heavy lifting:

    • Creating a conversation thread: It uses the Assistants API to start a new thread for each question.

    • Invoking the streaming endpoint: It then sends your question to Qlik Answers and streams the response back.

    • Processing the stream: As chunks of text come in, the backend cleans them up—splitting any concatenated JSON and only sending the useful text to the frontend.

    • Closing the stream: Once the complete answer is sent, it signals the end so your chat bubble doesn’t wait indefinitely.

    4. How It All Connects

    When you send a question:

    • Your message is displayed immediately in your custom chat bubble.

    • The backend creates a thread and requests an answer from Qlik Answers.

    • The response is streamed back to your UI in real time, making it look like the assistant is typing out the answer as it arrives.

    P.S.: This is just a simple example to introduce you to the new Answers APIs and show you how to get started using them; you’ll need to double-check limitations and adhere to best practices when using the APIs in a production environment.


    You can find the full code here:
    https://replit.com/@ouadielimouni/QA-Test-APIs#public/index.html

     

    Happy coding - and, Go Birds 🦅!

  • Product Innovation

    Qlik Talend Cloud Introduces Cross Project Pipelines and AI Processor for Snowfl...

    I’m thrilled to write this installment of Qlik’s innovation blog because the new Qlik Talend Cloud features I’ve chosen to highlight are two of the capabilities I’ve been testing over the past few weeks. So, without any further ado let's dive into these exciting new capabilities!

    Cross Project Pipelines

    Since its inception, Qlik Talend Cloud pipelines have offered a straightforward design metaphor. Often, you’d create a pipeline for a single data source that continually landed, merged, and transformed data changes into a single target, such as a cloud data warehouse or lake. As time progressed, the ability to add multiple data sources to a pipeline was introduced, and dedicated replication tasks with multiple targets followed a short time later.

    [Image: Qlik Talend Cloud Data Pipelines]

    However, many customers gave feedback that they’d like pipelines to be more modular, especially as projects became bigger and more complex. Modularity would not only increase component reusability, but also enable pipelines to be segregated by business domain. In addition, pipeline development would be more flexible while adhering to the best data-design practices.

    Well, I’m happy to announce that “Cross Project Pipelines” are now generally available in all tenants. You can split complex pipelines consisting of multiple ingestion and transformation tasks into components that can be reused by other projects, providing greater design flexibility and simplified pipeline management. In addition, Cross Project Pipelines can be segregated by data domain to encourage Business Domain Data Product or Data Mesh design principles.

    [Image: Cross Project Pipeline]

    AI Processor Snowflake Support

    At the end of 2024, we released an AI processor that allowed you to call native Databricks AI functions in a Transformation Flow without the need to hand-code SQL. Databricks AI functions are a set of built-in SQL functions that allow you to apply AI directly to your data within SQL queries. This means you can use powerful AI models for tasks like sentiment analysis, text generation, and more, all from your Qlik Talend Cloud pipelines. If you can’t remember that far back, then check out the Qlik Community blog post “Inject AI into your Databricks Qlik Talend Cloud Pipeline”.

    While many of our Databricks customers were overjoyed, the Snowflake proponents felt very left out, regularly commenting that Snowflake Cortex offered similar features too. Those comments were frequently followed by the question, “When will Qlik’s AI processor support Snowflake too?” Once again, I’m happy to say we’ve listened, and the AI processor now supports Snowflake Cortex AI functions as well! The details of how to use Snowflake Cortex go beyond the scope of this blog post, but stay tuned because a detailed article and demo of this feature will be published shortly. Until then, look at the screenshot below to see the AI processor in action and follow the link for more information about Snowflake Cortex LLM functions.

    [Image: Transformation Flow and AI Processor]

    Wrap Up and 2025 Roadmap Webinar

    Well there you have it. Two great new features that expand the usefulness and uses of Qlik Talend Cloud, but it doesn’t stop there. If you’re curious about what other innovations, enhancements, and improvements are coming to the Qlik platform in 2025 then join our Qlik Insider Webinar - Roadmap Edition that’s taking place on February 26th. Follow this link and register today!
  • Support Updates

    Techspert Talks - Advanced Qlik Sense System Monitoring

    Hi everyone,
    Want to stay a step ahead of important Qlik support issues? Then sign up for our monthly webinar series  where you can get first-hand insights from Qlik experts.

    On Thursday, February 27, Qlik will host another Techspert Talks session, and this time we are looking at Advanced Qlik Sense System Monitoring.

    But wait, what is it exactly?
    Techspert Talks is a free webinar held on a monthly basis, where you can hear directly from Qlik Techsperts on topics that are relevant to Customers and Partners today.

    In this session we will cover:

    • What performance metrics can be measured
    • How to set up the Zabbix tool
    • Integration and extendibility

     

    Choose the webinar time that's best for you.


    The webinar is hosted using ON24 in English and will last 30 minutes plus time for Q&A.
    Hope to see you there!!

  • Design

    Another ValueList Use Case

    Several years ago, I blogged about how creating a synthetic dimension using ValueList allowed us to color dimensions in a chart. ValueList is commonly used when there is no dimension in the data model to use, so a synthetic one is created with ValueList. You can read more about ValueList in my previous blog post. In this blog post, I am going to share how I used ValueList to handle omitted dimension values in a chart.

    I recently ran into a scenario when creating visualizations based on survey data. In the survey, the participant was asked for their age as well as their age group. The ages were grouped into the following buckets:

    • Under 18
    • 18-24
    • 25-34
    • 35-44
    • 45-54
    • 55-69
    • 70+

     

    Once I loaded the data, I realized that not all the age groups had participants, so my chart looked like the bar chart below. There was a bar and a value only for the age groups that the participants fit in.

    [Image: bar chart showing only the age groups that had participants]

    While I could leave the chart like this, I wanted to show all the age group buckets in the chart so that it was evident that there were no participants (0%) in the other age group buckets. In this example, the four age groups were consecutive, so it did not look odd to leave the chart as is but imagine if there were no participants in the 45-54 age bucket. The chart may look odd with the gap between 44 and 55.

    [Image: bar chart with a gap where the 45-54 age bucket would be]

    I explored various ways to handle this. One way was to add rows to the respective table for the missing age groups. This worked fine, but I was not a fan of adding rows to the survey table that were not related to a specific participant. The option that I settled on was using ValueList to add the omitted age groups. While this option works well, it can lead to lengthy expressions for the measures. In this example, there were only seven age group buckets, so it was manageable, but if you had many dimension values then it may not be the best option.

    To update the bar chart using ValueList, I changed the dimension from

    [Image: dimension expression before]

    to

    [Image: dimension expression after]

    Then I changed the measure from

    [Image: measure expression before]

    to

    [Image: measure expression after]
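
    Since the dimension and measure expressions are shown only as screenshots, here is a minimal sketch of what they might look like; the field names AgeGroup and ParticipantID, and the percentage measure, are assumptions for illustration only, and the remaining age groups follow the same pattern:

    // Hypothetical synthetic dimension listing every age group offered in the survey
    =ValueList('Under 18', '18-24', '25-34', '35-44', '45-54', '55-69', '70+')

    // Hypothetical measure: ValueList() is repeated with the same arguments so each
    // synthetic dimension value can be matched to its share of participants
    =If(ValueList('Under 18', '18-24', '25-34', '35-44', '45-54', '55-69', '70+') = 'Under 18',
        Count({<AgeGroup={'Under 18'}>} ParticipantID) / Count(ParticipantID),
    If(ValueList('Under 18', '18-24', '25-34', '35-44', '45-54', '55-69', '70+') = '18-24',
        Count({<AgeGroup={'18-24'}>} ParticipantID) / Count(ParticipantID),
        0))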

     

    Using ValueList in the dimension created a synthetic dimension with each age group option that was included in the survey. Now I will see all the age buckets in the chart, even if no participants fell into a given age group bucket. Since I am using ValueList for the dimension, I need to update the measure to use it as well. This is where a single-line measure can become a lengthier one, because I need to handle every value in the synthetic dimension, hence the nested If statement above. The result looks like this:

    [Image: bar chart showing all seven age group buckets, including those with 0%]

    There are no gaps in the age buckets, and we can see all the age bucket options that were presented in the survey. I prefer this chart over the first bar chart I shared because I have a better understanding of the survey responses presented to the participants as well as the response they provided. I would be interested in hearing how others have handled similar scenarios.

    Thanks,

    Jennell
  • Product Innovation

    Qlik Talend Cloud Packaging and Pricing - A primer


    Packaging: Qlik Talend Cloud brings together Qlik and Talend’s best of breed capabilities related to Data Integration, Quality and Governance into 4 simple use-case centric editions. These editions are designed to simplify the process of choosing the right solution and help customers focus on using the solution quickly to drive business results. With its broad range of best-in-class capabilities, Qlik Talend Cloud supports customer scenarios across every level of technical maturity; from ingestion of data in batches from SaaS applications into a cloud warehouse, to developing sophisticated data products with robust governance and everything in between. Depending on your business and technological needs at the moment, you can choose an edition today and smoothly transition to more advanced editions as needed over time without disruptions.

     

    Pricing Model: A key facet of Qlik Talend Cloud’s development was the introduction of a usage capacity-based pricing model for all capabilities within the Qlik Talend Cloud portfolio. This pricing model enables organizations to more tightly align their investment in Qlik to the value that they realize in the solution. Capacity bands have been defined to provide a specific level of usage capacity for each edition. Customers can start with an initial capacity commitment and as they ramp up the use of the solution, flexibly add more capacity bands to meet their business needs. There also is a structured pricing incentive for higher levels of capacity commitment to support deployments at scale.


    Pricing Metrics: To help customers plan their capacity commitment, Qlik Talend Cloud uses two simple types of capacity bands: one for data movement and basic transformation, and another for more sophisticated data integration and quality needs. Let’s look at them one by one.

    Data Moved: This is a measure of total volume of data moved (in GB) in a given month.

    • The initial full data load of new tables or files is free and not counted towards the metric. Any changes to the data over time do accrue towards the metric.
    • The data volume is calculated by multiplying the number of rows in the dataset by the row size. Row size is estimated by mapping data types to standardized Qlik Cloud data types and using default data sizes for each type. For a full list of default data sizes, please refer to this documentation.
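
    As a purely illustrative example with assumed figures: if a table’s columns map to standardized Qlik Cloud data types totaling roughly 500 bytes per row, and 2 million changed rows are replicated during a month, that month accrues about 2,000,000 × 500 bytes ≈ 1 GB towards the Data Moved metric, while the initial full load of the same table is not counted.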

    Job Executions: This is a measure of the total number of times each job (Artifact ID) is executed in a given month. (This metric is relevant for advanced data integration and quality needs.)

    • Batch job executions are counted upon successful conclusion in the given month
    • Always-on job executions are counted once in each month that the job was running

    Job Duration: This is a measure of the total time taken (in hours) for executions of all jobs in a given month. (This metric is relevant for advanced data integration and quality needs.)

    • Batch job duration is counted upon successful conclusion in the given month
    • For Always-on jobs, the actual duration is converted to a ‘chargeable duration’ using a progressive scale: the higher the number of always-on jobs, the lower the effective chargeable duration.

     


     

    Self-service usage dashboards: To enable customers to analyze their usage, Qlik provides an intuitive and interactive self-service usage dashboard that offers granular insights into usage trends and underlying drivers. Customers frequently use this information for internal cost allocation across different divisions or departments.

    Estimating capacity needs: Please reach out to your Qlik account team, who can work with you to understand your workloads and use in-house capacity estimation tools to estimate your capacity needs for data movement as well as job executions and duration.

    See below for a short video that summarizes the Qlik Talend Cloud packaging and pricing model. More details can be found on our pricing page here.

  • Product Innovation

    Introducing Qlik Trust Score: Elevating Data Trustworthiness in Qlik Talend Clou...

    In today’s data-driven world, trust in data isn’t just important—it’s essential. Organizations depend on high-quality data to drive informed decisions, fuel innovation, and maintain a competitive edge. But data quality isn’t one-size-fits-all. In customer service, a missing address might be acceptable if the primary contact method is valid. However, an incomplete or incorrect address can lead to payment failures and operational inefficiencies in billing and invoicing. 

    To evaluate trust in data, organizations need a metric-driven, objective measure that can be tuned to meet their specific definitions of data quality. A flexible and transparent approach ensures organizations can adapt trust assessments to their unique operational data quality needs. 

    Qlik Trust Score in Qlik Talend Cloud evaluates a dataset’s trustworthiness by aggregating various data quality dimensions. It provides a holistic view to help organizations identify gaps and prioritize improvements, with a numeric score (ranging from 0 to 5) for a quick assessment of dataset reliability. 

    Here are the key dimensions used to evaluate dataset trustworthiness, along with examples: 

    • Validity – Measures the proportion of values that conform to expected data types and formats. The expected data types and formats are obtained using built-in semantic types. For example, a valid email address must contain "@" and a domain. 
       
    • Completeness – Assesses the proportion of non-empty values within a dataset. For instance, missing primary contact information in a customer database could impact outreach efforts. 
       
    • Discoverability – Evaluates how well a dataset is documented and incorporated into operational workflows. This includes checking whether the dataset has meaningful tags and descriptions and whether it is included in an activated data product. For example, a sales dataset with appropriate tags and descriptions that is included in an activated data product would score higher than an undocumented, siloed dataset. This is because an activated data product in the data marketplace that includes the dataset enhances visibility and accessibility for data consumers, increasing overall value.
       
      So, how can you improve dataset discoverability? One quick way is by adding clear descriptions. With generative AI-based dataset description capabilities in Qlik Talend Cloud, you can effortlessly generate precise dataset descriptions, making them easier to find and understand. 

    • Usage – Measures dataset utilization across dependencies such as analytics apps, and the number of times assets using the dataset have been viewed. This dimension reflects the dataset’s true importance and its relevance to specific use cases. For example, a financial dataset embedded in an executive dashboard and accessed frequently by decision-makers would score higher than an isolated dataset with little visibility.
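    As a rough illustration of how dimensions like validity and completeness can be expressed as proportions, here is a minimal Python sketch. It is not Qlik's implementation: the regex stands in for the built-in semantic types mentioned above, and the sample values are invented.

```python
import re

# Simplified stand-in for a built-in "email" semantic type -- illustration only.
EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# Invented sample column values.
emails = ["ana@example.com", "bob@example", "", "carla@example.org", None]

non_empty = [v for v in emails if v not in (None, "")]
completeness = len(non_empty) / len(emails)                                # share of non-empty values
validity = sum(bool(EMAIL.match(v)) for v in non_empty) / len(non_empty)  # share of valid values

print(f"completeness = {completeness:.2f}, validity = {validity:.2f}")
```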

     

    Overview of Qlik Trust Score for the Shipping Route Dataset from Snowflake

     

    By evaluating datasets across multiple dimensions, Qlik Trust Score provides organizations with a clear, actionable view of data quality and reliability. To optimize performance and flexibility, it supports two processing methods. Pushdown processing, available exclusively for Snowflake datasets, triggers quality computations directly within Snowflake, ensuring efficient in-data-warehouse processing without data movement. Meanwhile, pull-up processing, available for all datasets, performs quality computations within Qlik Talend Cloud, enabling broader data quality assessments without relying on external processing resources.

    Key Benefits of Qlik Trust Score 

    • Customizable Weighting – Organizations can tailor the Qlik Trust Score to align with their specific data priorities. By adjusting the percentage weights of each dimension, teams can emphasize what matters most, such as validity for regulatory compliance or discoverability for analytics adoption. Non-mandatory metrics (such as usage or discoverability) can be disabled, ensuring the score accurately reflects business needs (see the sketch after this list).

     Tunable dimension weights to align with organization-specific data quality priorities

    • Actionable Insights – Qlik Trust Score goes beyond a single number by offering detailed insights into data quality. Using the “Data Preview” tab, users can drill down to the column level to quickly pinpoint issues such as invalid values or missing metadata.

    • Transparency & Collaboration – Trust in data improves when teams can see where the data comes from and how fresh the data source is. Dataset lineage provides this information, fostering an open culture of transparency and collaboration across data and analytics teams. Dataset freshness lets you see when the data source was last updated.
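    To make the customizable weighting concrete, here is a minimal Python sketch of the general idea: dimension scores are combined with tunable percentage weights, optional dimensions can be disabled, and the result is expressed on the 0 to 5 scale. The weights and scores are invented, and this is not Qlik's actual formula.

```python
# Minimal sketch of weighted aggregation with tunable weights -- not Qlik's actual formula.

def trust_score(scores, weights, disabled=()):
    """scores/weights: dict of dimension -> value; disabled: optional dimensions to ignore."""
    active = {d: w for d, w in weights.items() if d not in disabled and w > 0}
    total_weight = sum(active.values())
    weighted = sum(scores[d] * w for d, w in active.items()) / total_weight
    return round(weighted * 5, 2)   # express the result on the 0-5 scale

# Invented example values (each dimension scored 0-1, weights in percent).
scores = {"validity": 0.92, "completeness": 0.80, "discoverability": 0.40, "usage": 0.65}
weights = {"validity": 40, "completeness": 30, "discoverability": 15, "usage": 15}

print(trust_score(scores, weights))                        # all dimensions enabled
print(trust_score(scores, weights, disabled={"usage"}))    # optional dimension turned off
```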

     

    Conclusion 

    Qlik Trust Score is more than just a metric—it’s a powerful tool for building confidence and enhancing data trust. With customizable scoring, organizations can tailor data quality dimensions to align with their data priorities, focusing on the factors that matter most. 

    Available in Qlik Talend Cloud Enterprise Edition, Qlik Trust Score delivers robust, reliable data quality insights. For more details, visit the Qlik Trust Score documentation. 

    Show Less
  • Image Not found
    blog

    Support Updates

    Upgrade advisory for Qlik Sense on-premise November 2024: Required add-on upgrad...

    Hello Qlik Admins and Developers, The next major Qlik Sense Enterprise on Windows release is scheduled for November 2024. The update will introduce ch... Show More

    Hello Qlik Admins and Developers,

    The next major Qlik Sense Enterprise on Windows release is scheduled for November 2024. The update will introduce changes that will have an impact on the following add-ons:

    • Qlik Alerting (link)
    • Qlik Sense Mobile client-managed (link)
    • qlik-cli (link) Upgrade to v2.25.0 or higher
    • qlik/api (link) Upgrade to v1.12.0 or higher
    • Qlik Sense .NET SDK (link) Upgrade to v16.8.0 or higher

    The changes affecting the add-ons are:

    • Extended CSRF protection to WebSocket requests
    • Added support for CSRF to add-on products

    New versions of all affected add-ons were made available before or in November of 2024.

    Please plan your upgrade accordingly to prevent interruptions:

    If you upgrade to Qlik Sense Enterprise on Windows November 2024, all listed add-ons must be upgraded as well. 

     

    Thank you for choosing Qlik,
    Qlik Support

     

    Show Less
  • Image Not found
    blog

    Qlik Academic Program

    DHBW Mannheim & Qlik: Empowering Future Leaders with Data Literacy

    Imagine a world where data isn’t just numbers but a powerful tool for innovation. That’s exactly what DHBW Mannheim, the largest university in Baden-W... Show More

    Imagine a world where data isn’t just numbers but a powerful tool for innovation. That’s exactly what DHBW Mannheim, the largest university in Baden-Württemberg, is achieving with the Qlik Academic Program!

    In its Digital Commerce Management course, students don’t just learn about data—they experience it firsthand. Teaching data without hands-on practice? That’s like learning to drive without ever hitting the road!

    By integrating Qlik Sense, students gain real-world skills in data management, visualization, and analysis, preparing them for a data-driven workforce. They tackle real datasets, build dynamic dashboards, and explore how data drives decisions in retail and services.

    Why This Matters

    Industries are hungry for data-literate graduates who can analyze trends, optimize strategies, and innovate. Thanks to the Qlik Academic Program, DHBW students gain valuable qualifications that give them a competitive edge.

    With Qlik Solutions Architect Lukas Lohmann’s support, students dive into hands-on projects—from streaming service comparisons to market trend analysis—gaining confidence in their data skills.

    Ready to Learn More?

    📖 Read the full success story here:
    👉 German
    👉 English

    🌟 Explore the Qlik Academic Program and how we’re transforming education: Qlik Academic Program

    📩 Questions? Feel free to reach out at eliz.cayirli@qlik.com

    Stay tuned for exciting updates!

    Show Less
  • qlik-productblogs.jpg
    blog

    Product Innovation

    Dynamic Engine: The New Standard for Distributed Data Processing in Qlik Talend

    As the needs for data management evolve rapidly and the demand for large-scale processing increases, Qlik takes a bold step forward with the release o... Show More

    As the needs for data management evolve rapidly and the demand for large-scale processing increases, Qlik takes a bold step forward with the release of its groundbreaking product: Dynamic Engine.

    This processing engine, designed to integrate natively with Kubernetes (K8s), redefines the data processing architecture, offering a unified, scalable, and future-ready solution.

    In this blog, we will explore the key features of Dynamic Engine and what sets it apart from existing processing solutions within the Qlik Talend Data Fabric.

     

    Unifying and Scaling Data Processing at Large

     

    Dynamic Engine introduces a unified processing platform that adapts seamlessly to any workload, whether deployed in on-premise, hybrid, or SaaS environments. This flexibility makes it the solution of choice for enterprises looking to migrate or manage their data pipelines across diverse infrastructure setups.

     

    SimonSwan_0-1738682569177.png

     

    Here are the key advantages of Dynamic Engine:

    1. Natively Integrated with Kubernetes (K8s): By leveraging Kubernetes, Dynamic Engine processes jobs within pods, offering nearly unlimited horizontal scalability. Each project is isolated within its own environment (namespace-based), ensuring better resource management and enhanced security (see the sketch after this list).
    2. Distributed Processing at Scale: Unlike traditional Talend engines, Dynamic Engine enables the deployment of Kubernetes clusters to execute tasks in parallel, maximizing performance for batch processing and real-time data flows.
    3. Seamless Migration from Remote Engines: For existing Remote Engine users, the migration to Dynamic Engine is designed to be smooth and painless. Dynamic Engine becomes the new standard, ensuring a seamless transition while benefiting from enhanced features.
    4. Unified Data Workflows: One of Dynamic Engine’s main goals is to unify data processing workflows across the entire Qlik Talend suite, whether it’s integration, transformation, or data quality management. The engine standardizes and automates these workflows, regardless of data sources or destinations.
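    Dynamic Engine manages this namespace-based isolation for you; purely to illustrate the underlying Kubernetes concept, here is a minimal sketch using the official Python kubernetes client. The project name, image, and command are made-up placeholders, and this is not how Dynamic Engine is configured in practice.

```python
from kubernetes import client, config

# Illustration of namespace-per-project isolation -- names and image are placeholders.
config.load_kube_config()                     # use your local kubeconfig

project = "demo-project"                      # hypothetical project name

# 1. A dedicated namespace isolates the project's workloads and resources.
client.CoreV1Api().create_namespace(
    body=client.V1Namespace(metadata=client.V1ObjectMeta(name=project))
)

# 2. A batch job runs as a pod inside that namespace.
job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="demo-task"),
    spec=client.V1JobSpec(
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[client.V1Container(
                    name="task",
                    image="busybox",
                    command=["sh", "-c", "echo processing data"],
                )],
            )
        )
    ),
)
client.BatchV1Api().create_namespaced_job(namespace=project, body=job)
```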

     

    Dynamic Engine enables businesses to orchestrate data integration tasks on customer-controlled infrastructure while benefiting from cloud-managed services. But what sets it apart from traditional engines like Talend’s Remote Engine?

     

    Built-in Version Upgrade Mechanism

    Dynamic Engine simplifies the process of keeping your environment up to date with the latest versions. With a built-in version upgrade mechanism, users receive notifications via the Talend Management Console (TMC) whenever an update is available. The versioning system ensures consistency between the Dynamic Engine and its related Dynamic Engine Environments, which makes managing upgrades across different environments a straightforward process.

    This mechanism allows for easier updates, ensuring that users can always benefit from new features, security patches, and performance improvements without the manual effort typically required in version management.

     

    Smooth Migration from Remote Engine

    For those currently using Qlik Talend’s Remote Engine, the transition to Dynamic Engine is made seamless through TMC’s promotion-based migration path. This migration is designed to be as frictionless as possible, leveraging existing APIs and known workflows.

    With a few steps, users can promote their existing Remote Engine setups to Dynamic Engine configurations, preserving the familiarity of the existing environment while taking advantage of the added flexibility and cloud-native capabilities of Dynamic Engine.

     

    Scalability via Run Profiles

    Dynamic Engine’s ability to scale dynamically is one of its strongest features. Using TMC’s Run Profiles, organizations can define how their data tasks are distributed across resources.

    This level of customization provides businesses with the flexibility to optimize their resources, improve performance, and reduce costs—all directly managed through TMC.

     

    Compatibility with Leading Cloud Providers and On-Prem Infrastructure

    Dynamic Engine is designed to work across various cloud and on-prem infrastructures, making it a versatile choice for enterprises. It is currently compatible with:

    • AWS EKS (Elastic Kubernetes Service)
    • Azure AKS (Azure Kubernetes Service)
    • On-premise environments like RKE2 and K3S

    In the near future, compatibility will extend to Google GKE (Google Kubernetes Engine) and OpenShift, ensuring that Dynamic Engine can meet the needs of organizations across different platforms. This flexibility allows businesses to maintain a hybrid approach to cloud and on-prem infrastructure, aligning with their specific requirements.

     

    Comparison with Remote Engines

     

    Historically, Talend’s solutions relied on Remote Engines to execute jobs outside of the Cloud. These engines allowed enterprises to maintain control over their infrastructure while utilizing local processing capabilities. However, as scalability and flexibility demands grew, these engines faced some limitations. Dynamic Engine, on the other hand, positions itself as a modern and automated solution.

     

    Remote Engine vs. Dynamic Engine

    • Scalability – Remote Engine: constrained by the capacity of the machines it ran on; for large workloads, the infrastructure had to be adjusted manually and constantly. Dynamic Engine: automatic scalability powered by Kubernetes.

    • Data flows – Remote Engine: although highly performant, it required specific configurations for each type of processing, leading to fragmented workflows. Dynamic Engine: more fluid orchestration of data flows, centrally controlled via the Talend Management Console (TMC).

    • Environment management – Remote Engine: a high level of manual management for scaling and resource optimization. Dynamic Engine: optimized resource utilization through Kubernetes pods, which can be easily provisioned and managed dynamically.

     

     

    How It Operates and Deploys Across Different Environments

     

    Dynamic Engine is designed to be deployed seamlessly in various environments. Here’s a closer look at its operation:

    1. On-Premise (with Cloud Orchestration via TMC): Although Dynamic Engine can be deployed on a Kubernetes cluster located on an on-premise infrastructure, its management and orchestration are still handled through the Talend Management Console (TMC) in the Cloud. This means that enterprises with local infrastructures can leverage their existing resources while benefiting from Cloud flexibility and agility for job management, performance monitoring, and workload scaling. This hybrid approach allows businesses to retain control over their physical resources while centralizing environment management through the Cloud.
    2. Hybrid or SaaS (via Kube as a Service approach): For enterprises operating in hybrid or fully Cloud-based environments, Dynamic Engine integrates seamlessly via platforms such as Qlik Cloud or directly through TMC. This flexibility allows Kubernetes clusters to be deployed in the Cloud and jobs to be managed at scale without requiring local infrastructure, making deployment simpler and faster.
    3. Scalability through Kubernetes: One of Dynamic Engine’s major strengths is its ability to dynamically provision Kubernetes nodes based on processing needs. This ensures that data pipelines can automatically scale with workload variations, maximizing resource efficiency and reducing processing times.

    Full documentation can be found here.

     

    Conclusion: A Future-Ready Engine

     

    With Dynamic Engine, Qlik offers a solution that not only addresses today’s challenges of large-scale data processing but also sets the standard for future data management needs. Whether enterprises are looking to scale their processing capacity, unify their data workflows, or automate environment management, Dynamic Engine stands out as the solution of choice.

    Together with Talend Data Fabric, Dynamic Engine creates a complete ecosystem that transforms how data is integrated, processed, and leveraged across the organization.

    Show Less
  • qlik-productblogs.jpg
    blog

    Explore Qlik Gallery

    Financiamento Apartamento e Despesas

    Financiamento Apartamento e Despesas | Personal | Application for financial organization | Discoveries: Set Analysis | Impact: Organization | Audience: Administrator | Data a... Show More
    Show Less
  • Image Not found
    blog

    Product Innovation

    Creating a write back solution with Qlik Cloud is now possible!

    Static, read-only dashboards are a thing of the past compared to what's possible now in Qlik Cloud. ‘Write back’ solutions offer the ability to input... Show More

    Static, read-only dashboards are a thing of the past compared to what's possible now in Qlik Cloud.

    ‘Write back’ solutions offer the ability to input data or update dimensions in source systems, such as databases or CRMs, all while staying within a Qlik Sense app.

    The solution incorporates both Qlik Cloud and Application Automation to enable users to input data from a dashboard or application and run the appropriate data refresh across the source system as well as the analytics.

    Example Use Cases:

    1. Ticket/ Incident Creation
      • Create Ticket or Incident in JIRA or ServiceNow.
    2. Data Changes
      • Update Deals/Accounts in a CRM like HubSpot or Salesforce.
    3. Data Annotations
      • Add a comment to one or more records in a source system.

    This new feature is possible with all of the connectors located in Application Automation, including:

    • CRMs like HubSpot or Salesforce
    • Databases like Snowflake, Databricks, Google BigQuery
    • SharePoint
    • and more!

    Below you can see a technical diagram based on using Application Automation for a write back solution.

    Diagram.png

    The ability to write back in Qlik Cloud is a game changer for customers who want to operationalize their existing Qlik Sense applications and enhance decision making right inside the app where the analytics live. This not only streamlines business processes across an ever-growing data landscape, but also enables users to act in the moment. With Application Automation powering the write back executions, customers can unlock more value across their data and analytics environment.
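    As a rough sketch of the pattern, the snippet below shows how an external caller could pass values captured in an app to an Application Automation configured with a triggered (webhook) start block. The URL and payload field names are placeholders invented for illustration; the real trigger URL, including its execution token, comes from the automation's start block, and the exact write back flow depends on how the automation is built.

```python
import requests

# Illustration only: the trigger URL and field names below are placeholders.
AUTOMATION_WEBHOOK = "<trigger-URL-copied-from-your-automation-start-block>"

payload = {
    "record_id": "ACME-0042",                           # row selected in the Qlik Sense app
    "comment": "Customer confirmed new billing address",
}

resp = requests.post(AUTOMATION_WEBHOOK, json=payload, timeout=30)
resp.raise_for_status()
print("Automation run accepted:", resp.status_code)
```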

    To learn more, and for a more hands-on tutorial, please see the video here.

    Show Less