Business


Analytics & AI

Forums for Qlik Analytic solutions. Ask questions, join discussions, find solutions, and access documentation and resources.

Data Integration & Quality

Forums for Qlik Data Integration solutions. Ask questions, join discussions, find solutions, and access documentation and resources.

Explore Qlik Gallery

Qlik Gallery is meant to encourage Qlikkies everywhere to share their progress – from a first Qlik app – to a favorite Qlik app – and everything in-between.

Support

Chat with us, search Knowledge, open a Qlik or Talend Case, read the latest Updates Blog, find Release Notes, and learn about our Programs.

Events

Learn about upcoming Qlik related events, webinars and local meetups.

Groups

Join a Group that is right for you and get more out of your collaborations. Some groups are closed. Closed Groups require approval to view and participate.

Qlik Community

Get started on Qlik Community, find How-To documents, and join general non-product related discussions.

Blogs

This space offers a variety of blogs, all written by Qlik employees. Product and non product related.

Qlik Resources

Direct links to other resources within the Qlik ecosystem. We suggest you bookmark this page.

Qlik Academic Program

Qlik gives qualified university students, educators, and researchers free Qlik software and resources to prepare students for the data-driven workplace.

Community Sitemap

Here you will find a list of all the Qlik Community forums.

Recent Blog Posts


    Qlik Academic Program

    Support from Educators Leads to Career Development

    Survey findings reveal that educators who engage more deeply with their students significantly contribute to their career development. Students often seek guidance about life after graduation—questions like, "What will I be doing? Where will I be? Will I be happy in my career choice?"—are common. While educators may only have limited time with students in the classroom, stepping into a mentoring role can offer profound benefits for their future success.
     
    Read what students have to say about the importance of professor involvement in their career journeys:
    Half of College Students Say Professors Should Be More Involved in Career Planning
     
    Qlik's academic program equips educators with the necessary content, curriculum, and resources to provide the guidance students need. By fostering mentorship, educators can help students navigate their career paths and make informed decisions for their futures. To learn more about how you can get involved, please visit our academic page, www.qlik.com/academicprogram
    BrittanyFournier

    Qlik Academic Program

    Embracing Diverse Learning Styles in Academia

    Learning can be a challenge for many, while others may find it comes more easily. In our quest to discover the best approaches for individual learning, we recognize the importance of flexibility. By structuring our days to align with our natural rhythms and seeking inspiration when it strikes, we can absorb information more effectively—even if our methods differ from the norm.

    To explore strategies for thriving with ADHD in academia, check out the full article here: Prevailing with ADHD in Academia.

    At our Academic Program, we provide guidance and resources tailored for both students and educators. With various ways to learn Qlik at your own pace, you can gain valuable insights while developing your skills. Whether you need structured lessons or a more flexible approach, we’re here to support your learning journey!

To learn how you can access free resources in data analytics as a student or professor, visit www.qlik.com/academicprogram.

    BrittanyFournier

    Explore Qlik Gallery

    Opportunity Dashboard

GainInsights Solutions

This dashboard was one of my first attempts in Qlik Sense. It is a simple dashboard that gives insight into deals won and deals lost. It includes competitor details and the reason each deal was lost, giving an overview of the organization's strengths and of where it loses, to whom it loses, and why.

    Discoveries

This dashboard gives you a quick insight into deals won and deals lost.

    Impact

This mainly helps the organization see why they lost, to whom they lost, and for what reason. For example, 20% of their losses might be to one organization for one particular reason (the reason can be cost). The next time they compete with that organization, they can strategize to win deals.

    Audience

    Sales team of any organization


    SenthilQlik

    Support Updates

    Qlik Data Transfer November 2022 expires on the 28th of November 2024

    Updated November 14th 15:00 CET: Download link.
    Updated November 15th 15:40 CET: Release Notes.

    Hello Qlik Cloud Administrators,

    The current version of Qlik Data Transfer (November 2022) will expire on the 28th of November 2024.

    Your log files may already show the following information:

    ABOUT TO EXPIRE: This engine is about to expire. Please upgrade to a newer version! Expiry date: 2024/11/28

    Qlik Data Transfer will stop functioning after this date. 

    How do I prevent downtime?

    A new version of Qlik Data Transfer has been released (14th of November), which will guarantee continued functionality and support until its end-of-support date. The November 2024 release is available on the Download page and its release notes can be accessed in Qlik DataTransfer Release Notes - November 2024.

    Qlik Data Gateway - Direct Access

Whenever possible, Qlik recommends using Qlik Data Gateway - Direct Access to load on-premises data into a Qlik Cloud tenant. The supported databases and the Generic ODBC Connector Package, together with the upcoming support for loading on-premises files, may allow you to decommission Qlik DataTransfer servers (and potentially repurpose them as gateways).

    For more information, see Qlik Data Gateway - Direct Access.

     

    Thank you for choosing Qlik,
    Qlik Support

    Sonja_Bauernfeind

    Design

    Monitor Visualizations in the Cloud Hub

    Did you know that you can monitor visualizations in the cloud hub? Visualizations from sheets in an app and from Insight Advisor can be selected for monitoring. Once selected, these visualizations will appear in the cloud hub in the Your charts section on the Home page and under Charts on the Explore page. Charts can be saved with or without selections and are refreshed every time the source app reloads. Let’s look at how it is done.

In the image below, the bar chart is on a sheet in my app. Right-click on the chart and then click on the ellipsis (…) to see the Monitor in hub option displayed below. In Insight Advisor, the ellipsis is in the top right of the chart. Any selections applied to the chart will be saved and applied to the chart that you are monitoring.

[Image: monitor.png]

    What is nice about this is you can keep an eye on a specific chart without having to open the app and navigate to the visualization. The latest reloaded version of the chart is always the one that is visible in your cloud hub as seen below.

[Image: view charts.png]

If you want to look at the chart in the app, that is easy to do. When you hover over the chart, there are two options: View chart and View in app. View chart will open the latest reloaded version of the chart. Selections cannot be made from here, but they can be made if View in app is selected. View in app will open the chart in the app or in Insight Advisor, depending on where the source chart was selected. From Home or Explore, you can also click on the ellipsis in the lower right of the chart to:

    • See details about the chart such as the reload history
    • Add the chart to a collection
    • Edit some properties of the chart
    • Delete the chart from monitoring

[Image: eclipse.png]

Another nice feature is that you can compare versions of the chart. To do this, select the View chart option. From here, you can not only see the latest version of the chart you are monitoring but also view previous versions, and you can compare two versions of the chart. When you hover over a chart in the Chart history section that is not selected, you will see a 1 and/or 2 in the top right corner of the chart. Selecting 1 will move the chart to the left side of the selected chart for comparison. Selecting 2 will move it to the right side. In the image below, the latest reloaded chart is selected, and when I hover over the previous chart, I can see the 2 in the upper right that I can select to compare it to the selected visualization.

[Image: compare.png]

Monitoring your charts in your personal space in the cloud hub gives you a new level of flexibility and allows you to quickly see the visualizations that are important to you, with options to explore them further if needed. Check out Michael Tarallo's Qlik Sense in 60 - Chart Monitoring video to quickly see how it is done, and see Qlik Help for more information on monitoring your visualizations in the cloud hub.

    Thanks!

    Jennell_Yorkman

    Design

    Toggle Sheet Header and Toolbar On and Off

    There have been many new capabilities that give developers ways to customize and style an app. In this blog, I will review how the sheet header and toolbar can be toggled on and off and the benefits of each, as well as things to consider. The sheet header and the toolbar both appear at the top of an app. The sheet header, outlined below in yellow, includes the name of the sheet, an optional logo or image, and previous and next sheet navigation arrows.

[Image: header and toolbar - sheet header.png]

    The toolbar is the row above the sheet header. It includes buttons and links to Notes, Insight Advisor, selections tool, bookmarks, sheets and edit sheet.

[Image: header and toolbar - toolbar.png]

    The toggle for the sheet header and toolbar can be found in the app options section of an app. Open app options by clicking on the arrow next to the app name at the top center of the app. From there, click on the App options icon on the right.

[Image: app options.png]

Once the app options are open, you will find the toggles for Show toolbar and Show sheet header.

[Image: open app options.png]

One of the main benefits of removing the sheet header and toolbar is to gain more space on the sheet. The space used by the sheet header and toolbar becomes space that developers can use for additional filter panes and/or visualizations. Another benefit is that developers can add custom capabilities to replace the Qlik Sense defaults. For example, a developer may want to create their own navigation buttons and have more control over the options that are available to the user. If the sheet(s) are being used to create a PowerPoint presentation, removing the sheet header and toolbar makes the presentation look more polished.

Now let’s discuss some things to consider when removing the sheet header. If the sheet header is removed, alternative sheet navigation should be provided for the user. It is possible to use your keyboard to navigate the sheets, but many people do not know that, so custom navigation should be created by the developer using buttons or links. In the image below, buttons are used.

[Image: custom navig.png]

    In the image below, buttons are used again but the highlighted button indicates the sheet the user is on. So, in this example, the developer has replaced the sheet navigation and the sheet title that was included in the removed sheet header. 

[Image: button navig.png]

A sheet title can also be added to a sheet using a Text & image object. The custom navigation can be designed to match a theme or company brand, which gives the developer a lot of flexibility and can give a company’s apps a consistent look and feel.

When the toolbar is toggled off, features are hidden but they are not removed from the app entirely. This is great, but not all users may be aware of alternative ways to access the features on the toolbar, so it is important to keep this in mind. For example, users can still create or view notes for a visualization by right-clicking on a chart, selecting the ellipsis (…), and then selecting Notes. Another example is that users can still access bookmarks or the sheets in an app via the App Overview. Users can still ask questions via Insight Advisor, so no functionality is lost with the removal of the toolbar. Another thing to consider is that while selections can still be made via filter panes and visualizations, without the selection bar users may not be aware that selections have been made. This is why the developer needs to make sure there are filter panes or some other way for users to know what has been selected. When it comes to selections, buttons can also be used to perform actions such as clearing selections and making selections in a field.

The overall goal is not to make things harder for the user, so knowing possible issues and designing for them is smart. While there are benefits in toggling off the sheet header and/or toolbar, developers must consider how this may impact their users and how their users will use the app. The user experience can be just as good with the sheet header and toolbar toggled off if the developer plans well for an intuitive user experience.

    Thanks,

    Jennell

     

     

     

     

    Jennell_Yorkman

    Explore Qlik Gallery

    Holiday Dashboard

Baker Tilly N.V.

    Discoveries

    The purpose of this dashboard is to highlight the customization and interactivity capabilities of Qlik Sense through a holiday-themed travel experience. Users can explore their past trips, get useful information about their travel destinations, and organize their future travel plans by managing a personalized wishlist.

    Impact

    The dashboard allows users to visualize historical travel data, including destinations, trip durations, and seasonal trends, while offering real-time insights into upcoming destinations based on factors like weather, costs, and available activities.

    Audience

    This interactive tool is designed for travel enthusiasts, frequent travelers, and anyone looking to make informed decisions about their trips. In particular, BI analysts.

    Data and advanced analytics

    It demonstrates how Qlik Sense’s powerful analytics can transform travel data into valuable insights, empowering users to better plan and reflect on their journeys. The dashboard showcases the flexibility of Qlik Sense in creating engaging and dynamic user experiences, making it an invaluable tool for personalized decision-making and future travel planning.

    SterreKapteijns

    Design

    Orchestrating Talend Jobs

    This blog post will explore three different options for orchestrating Talend Jobs in the Qlik Platform.

    • Execution Plans – orchestrate Tasks running in different engines from the TMC
    • tRunJob – orchestrate child jobs from a parent job all running in the same JVM
    • TMC API – orchestrate child Tasks running in different engines from a parent Task

    Execution Plans

    Talend Jobs are designed in Studio and ultimately published to Talend Cloud where the Job Artifacts are configured and scheduled as Tasks.  One way of orchestrating Tasks is to use Execution Plans.  Execution Plans allow you to define a series of sequential steps.  Each step can run one or more Tasks in parallel.

    All job execution and configuration is delegated to the Task.  So which Remote Engine or Remote Engine Cluster is used to run the Task is specified in the Task.  The same is true for Task parameters which are ultimately mapped to Context Variables in the Job.

Since Task parameters are static, Execution Plans are also static. Their parameters can only be specified a single time, during Task definition. This may be sufficient for some Jobs where, for example, a network folder is specified which is then scanned for files on a schedule. But many times a Task would benefit from more dynamic parameters. For example, a Job might receive a URL for the location of a file to be processed. This URL could vary based on the context of the orchestration. In this case Execution Plans do not work.

    Execution Plans are also static in the sense that there is no conditional or looping logic.  So a Task cannot be executed in response to any conditions or the output of prior steps.  Nor can it be executed a variable number of times.

    Execution Plans are also limited because they have no Error Handlers, which is really a form of conditional processing.  They do have the ability to Rerun the Plan, but this requires manual interaction in the TMC UI.  Programmatic control is possible with the TMC API for Plan Executions but at that point you might as well build it into the Job.

    Execution Plans do benefit from being able to run multiple tasks in parallel in a single step.  But this is also limited.  A frequent design strategy is to scale a process by running multiple instances of the same Task in parallel to spread the workload across multiple servers.  Assuming that the data can be partitioned, e.g. by folders, files, or some partitioning key(s) in a database table, multiple instances of a Task can be configured to run concurrently.  The degree of concurrency can be managed at the Remote Engine level.

    The limitation here is that although Tasks can be run concurrently on the same Remote Engine, Execution Plans do not allow multiple instances of the same Task in the same Step.  So running parallel instances of a Task is limited to use by the TMC API.
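The partitioning strategy described above can be sketched generically. The Python sketch below shows how per-instance parameters for N concurrent runs of the same Task might be derived; the function and field names ("partition_keys", "instance") are illustrative assumptions, not actual TMC fields or Job context-variable names.

```python
# Sketch: derive the parameter set each of N concurrent instances of the
# same Task would receive. Field names are illustrative assumptions only.
def partition_parameters(keys, num_workers):
    """Assign each partition key (e.g. a folder, file, or key range in a
    database table) to one of num_workers concurrent Task instances."""
    buckets = [[] for _ in range(num_workers)]
    for key in keys:
        buckets[hash(key) % num_workers].append(key)
    return [{"instance": i, "partition_keys": bucket}
            for i, bucket in enumerate(buckets)]

# Four monthly partitions spread across two concurrent instances of one Task.
params = partition_parameters(["2024-01", "2024-02", "2024-03", "2024-04"], 2)
```

Each dict in `params` would become the parameter set of one concurrent execution of the Task, e.g. passed when triggering the Task through the TMC API.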

    From an SDLC perspective, Execution Plans make integration testing a bit more difficult since they involve running the Execution Plan in the operations environment (the TMC) rather than the development environment (Studio).  Integration Test automation can overcome this by using the TMC API to launch the integration, but debugging can still be cumbersome.

    With that said, Execution Plans still offer some benefits.  They provide a simple user-friendly interface for an operations team.  This supports separation of duties between operations and development for simple static orchestration.

    Finally, Execution Plans run inside the TMC control plane in the Qlik Cloud.  Therefore, any orchestration metadata is passing outside of the customer’s network.  But since Execution Plans are so simple and so statically constrained, there is no risk of customer data being exposed.  After all, orchestration is completely static and configuration is completely delegated to the child Tasks.

    The table below summarizes the advantages and limitations of Execution Plans.

Advantages:

• Modular Composition
• Simple browser UI in TMC
• Separation of Duties

Limitations:

• Static specification of Tasks
• Static configuration of Tasks
• Cannot execute the same Task concurrently
• Limited orchestration semantics
• Multiple development contexts (IDE and browser)
• More difficult to create integration tests

    tRunJob

The tRunJob component in Talend Studio is used to run other “child” jobs from a “parent” job. This promotes modularity and re-use. Since both the parent and child jobs are designed in Studio, they are also easy to test. Child jobs can be tested individually, and integration tests can be done by running the parent.

The main limitation of using tRunJob is that child jobs run in the same JVM process as the parent. This means there is no ability to scale horizontally. However, multi-threading is possible by using the tParallelize component or enabling parallel iterators on components such as tFlowToIterate, tWaitForFile, or tLoop.

    Since tRunJob is a component used within a Job, it benefits from the conditional and control flow supported by all Talend Jobs. 

While flexible, it must be emphasized that child jobs are specified at design time and hence are static. A parent job can use conditional flow to select the appropriate child job for a particular stage of processing, provided that all variations of child jobs at a certain step are known at design time. But when the variation is unknown at design time, when extensions may need to be deployed without rebuilding the job, or when there are too many variants, the static nature of tRunJob can be a problem.

    Unlike Execution Plans, tRunJob can also specify the parameters for the child job.  These can even be derived from the output(s) of previous steps in the parent job.  If a parent job wants to simply pass all of its context variables to the child this can be done as well.  While not as disciplined as formally observing the contract with the child, this can be convenient.

    Child jobs can also return a dataset using the tBufferOutput component.  The outbound datastream is returned by the child job and each record can be processed by the parent job.  This allows the output of a child job to be iterated over as a collection by a parent job.  In some cases each output row may be an input parameter to other child jobs.  This is usually done using tFlowToIterate on the output of the first child job and then using a second tRunJob in the iterator.
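Since Talend Jobs are designed graphically, here is a rough Python analogy of that pattern; the function names are stand-ins for Jobs, not Talend components or APIs.

```python
# Python analogy of the tBufferOutput / tFlowToIterate / tRunJob pattern.
# All names here are illustrative stand-ins for graphically designed Jobs.
def child_list_files():
    """Plays the role of a child job ending in tBufferOutput:
    it returns a dataset to its caller."""
    return [{"path": "/data/a.csv"}, {"path": "/data/b.csv"}]

def child_process_file(row):
    """Plays the role of the second child job, invoked once per row
    with parameters derived from that row."""
    return f"processed {row['path']}"

def parent_job():
    # tFlowToIterate: iterate over the first child's output and run a
    # second tRunJob for each row.
    return [child_process_file(row) for row in child_list_files()]
```

The key point the analogy captures is that the first child's output rows drive how many times, and with what parameters, the second child runs.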

    Another advantage of using tRunJob is that re-use of the child job is done at the design stage, so it does not require publication of the child job to the TMC, since the child job is incorporated into any parent jobs that use it.  The result is less clutter in the TMC.  Of course, the flip side of this is that parent jobs are monolithic and the child job may be redundantly embedded in many parent jobs. 

    Talend Studio manages dependencies for you, so changes to the child job can be viewed based on the Impact Analysis.  Whenever a parent job is run in the Studio it will detect any changes in the child job and rebuild it if necessary.  And if CI/CD is used per standard best practice then the Tasks corresponding to the parent job will also be rebuilt when the child job is modified.

    Unlike Execution Plans there is no separation of duties between development and operations because all orchestration is done at design time.  However, context variables can be used with control flow to externalize selection of child jobs subject to the static design time constraints above.

Finally, orchestration using tRunJob takes place in the job itself, and as such it runs in the data plane. This means that any configuration logic or passing of parameters to child jobs stays within the customer's network, so there is no chance of data leakage by running in the Cloud.

    The table below summarizes the advantages and limitations of tRunJob.

Advantages:

• Modular
• Easy to test
• Multi-threading support via tParallelize and iterators
• Flexible configuration of Context Parameters across Jobs
• Conditional and flow control
• No extra Task clutter in TMC
• Return datasets
• Runs in Data Plane

Limitations:

• Static specification of Child Jobs
• Does not scale horizontally
• Monolithic deployment
• Limited separation of duties

    TMC API

The TMC API can be used via the tRESTClient component to trigger Tasks in TMC on demand. This provides an effective means of orchestration based on well-defined service contracts that promote modularity and re-use. It also provides loose coupling.

Using this approach scales horizontally across multiple servers using a Remote Engine Cluster. Unlike Execution Plans, the same Task can be run concurrently to scale out partitioned workloads.

    Since the API is being used within a Studio job, all Studio conditional and control flows are available.

    Like tRunJob, parameters can be passed to the other child services via the API, and the context parameters can be derived dynamically from the output of previous steps.

    Unlike tRunJob, because of the loose coupling there is no design time limitation on specifying which child services to call.  Which child services are to be run can be specified via configuration by an operator, promoting separation of duties.  But they can also be controlled programmatically for truly extensible workflows.

    Since specification of orchestrated tasks is loosely coupled and can be dynamically configured, it can also be externalized for clear separation of duties with the operations team.

Since the parent job doing the orchestration is running in the data plane itself, the only data passing outside the customer's network is via the well-defined interfaces of the TMC API. Child service parameters can be passed as references to URLs or other means of indirection so that no business data is sent to the Cloud.
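As a minimal sketch of that approach (in Python rather than a Studio Job), triggering a Task on demand amounts to one authenticated POST. The base URL, endpoint path, payload fields, Task id, and token below are all assumptions to verify against your TMC version and its API documentation before use.

```python
import json
import urllib.request

# Assumptions: region-specific base URL, a hypothetical Task id, and a
# personal access token placeholder. None of these values are real.
TMC_BASE = "https://api.us.cloud.talend.com"
TASK_ID = "0123456789abcdef01234567"   # hypothetical executable id
TOKEN = "YOUR_PERSONAL_ACCESS_TOKEN"

def build_execution_request(task_id, parameters):
    """Build the POST that asks TMC to run a Task on demand.

    The payload shape (executable + parameters) reflects the public
    Talend Cloud executions API as the author understands it; confirm
    the exact path and fields for your TMC version.
    """
    payload = {"executable": task_id, "parameters": parameters}
    return urllib.request.Request(
        f"{TMC_BASE}/processing/executions",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Parameters derived dynamically at run time -- here a URL reference
# rather than business data, so nothing sensitive leaves the network.
req = build_execution_request(
    TASK_ID, {"file_url": "https://files.example.com/batch-042.csv"})
```

The request would be sent with `urllib.request.urlopen(req)` (not executed here); in a real parent Job the same call would be made from tRESTClient, with the response's execution id polled for completion.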

    Testing is facilitated by the modular service orientation.  Integration tests are also easier to perform than with Execution Plans because the parent job is itself a job which can be debugged in Studio.  However, unlike with tRunJob the parent and child jobs are running in separate processes, and hence it is inherently not possible to debug both parent and child processes in the same environment.

    The table below summarizes the advantages and limitations of TMC API based approach.

Advantages:

• Modular
• Loose Coupling
• Horizontal scaling
• Conditional and flow control
• Flexible configuration of Context Parameters across Jobs
• Dynamically specify Tasks
• Extensible
• Separation of Duties
• Runs in Data Plane

Limitations:

• Somewhat more difficult to create integration tests
• More complex TMC API calls

    Summary

Comparing the three options, it is clear that the TMC API approach provides better modularity, looser coupling, more dynamic and extensible behavior, and horizontal scalability. The only limitation is that the jobs must make more complex TMC API calls.

    We will cover the TMC API in the next post, and in subsequent posts we will provide sample jobs that show how to encapsulate the complexity of the TMC API as re-usable child jobs.

     

    EdwardOst

    Qlik Digest

    Qlik Digest - November 2024

Catch Up on Recent Webinars


     

    Lakehouses: Driving the Future of Data & AI

    Data prep is often a time-consuming hurdle, but it doesn’t have to be. In our latest webinar, we discussed how AI and automation can simplify data management, reduce manual work, and accelerate your path to insights. Whether you’re a data engineer or a senior leader, there’s something for everyone.

    Watch on-demand

    Crafting the perfect Report: Embracing Modern BI

Missed our recent Reporting Webinar? No problem. Be sure to catch up on-demand as we dive deep into a mission-critical capability of Qlik Sense – reporting!

    Receive a stellar demonstration, AND a special announcement that will take your Qlik reporting from great to “perfect”.

    Watch on-demand

     

    Join Qlik at AWS re:Invent 2024 – December 2-6

    ElizabethKropp_3-1732202614575.png

    Are you planning to attend AWS re:Invent 2024? If so, stop by booth #1928 to talk with our executives.

    We will have live demos, roadmap discussions, and a Qlik-Accenture joint presentation on Monday, December 2nd at 4pm PST.

    Learn more

     

    Register Now for Qlik Connect 2025

    ElizabethKropp_4-1732202634486.png

    Be our guest for three magical days of learning, networking, and inspiration — and discover new ways to maximize your data’s value!

    Qlik Connect is coming to Disney’s Coronado Springs Resort in Orlando, FL, and you won’t want to miss the inspiring keynotes, best practices from industry experts, product innovations, a sneak peek at the Qlik roadmap, free on-site certification opportunities, and more.

    Claim your $500 savings before it turns into pixie dust.

    Register today!

     

    Qlik Luminary Applications Now Open!

    ElizabethKropp_5-1732202732477.png

    Are you passionate about Qlik and driving business transformations with data? If you've been a vocal advocate for Qlik throughout the year, we invite you to apply for the 2025 Qlik Luminary Program!

    Exclusive Benefits Await:

    • Education: Free access to Qlik training and product licenses.
    • Access: Engage with Qlik Executives, R&D, Product Management, and Customer Success teams.
    • Community: Join a private Qlik Luminary forum on Qlik Community.
    • Insight: Quarterly NDA briefings with Qlik executives.
    • Perks: VIP treatment, major discounts, and special activities at Qlik events.
    • Credibility: Earn digital badges and a featured profile on Qlik.com.

    Mark Your Calendars:

    • Applications Open: November 11th
    • Applications Close: December 16th

    Learn more and apply

     

    Exciting New File Connector Release

    ElizabethKropp_6-1732202861182.png

    The File Connector for the Data Gateway is being launched by the end of November. It provides a key capability to bridge on-premises file data to Qlik Cloud Analytics, facilitating migration and enabling hybrid architectures. It aims to simplify access to file-based data sources while organizations transition to cloud-based analytics. 

    Migration Value

    • Enable phased migration by maintaining access to on-premises files
    • Reduce migration barriers with familiar file access capabilities
    • Replace the Qlik Data Transfer tool (EOL 2025) with a more robust solution

    Key Benefits

    • Accelerate Cloud Migration: Easily access and leverage existing on-premises data, especially QVDs, in Qlik Cloud Analytics.
    • Seamless Data Access: Load firewalled data files directly into Qlik Cloud, supporting any file type handled by Qlik Engine.
    • Simplified Configuration: Utilize predefined connection definitions for quick setup.
    • Flexible File Selection: Take advantage of wildcard support for folders and files

    Features

    • Access network drives and file systems on the Gateway server
    • Preview capability for files (with some limitations for large files)
    • Read-only access to ensure data security

    Learn more:  File Connector via Direct Access Data Gateway | SaaS in 60 & Connector Factory Blog

     

     

     

    Elizabeth-Kropp

    Design

    Enhanced File Management


    A new file management system has been added to Qlik Cloud allowing users to create folders and subfolders for their data files. This hierarchical folder structure is available in your personal space, as well as shared, managed and data spaces. Now, I can add folders to a data space in my tenant and use folders to organize my files. This is extremely helpful when I have a project that has a lot of data files.

    There are two ways to add folders to a space. The first is through Space details as seen in the image below.

    Space details.png

    After clicking on Space details, select Details. Then select Data files.

    data files.png

    From the screenshot below, I can use the icons at the top or click on the ellipsis at the end of a folder row to:

    1. Upload files to a folder
    2. Add a folder
    3. Copy a folder
    4. Cut and paste a folder
    5. Delete a folder

    I can also use the icons and menu options to cut, copy and paste files and/or folders to somewhere else within the space or to another space. More than one file can be cut, copied or pasted by selecting the entire folder or by holding Ctrl and selecting each file.

    files options.png

    The second way to add a folder to a space is via Administration > Content.

    content.png

    Let’s add one more folder to the CAJ space to see how it is done. To add a 2023 folder to the CAJ space, I will click on the Add folder icon and enter the name of the new folder. Be sure to confirm the path is correct. If not, select the correct path from the Path dropdown. Then click the Create button.

    add folder.png

    Once the folder is added, I can click on the ellipsis for the 2023 folder and select Upload file to folder to upload files to this folder. If I used the Upload icon at the top, I would have to change the path to the 2023 folder before uploading my files.

    This enhancement also lets me create subfolders as seen in the image below. I have a Population folder in the Census folder.

    subfolders.png

    Now, let’s look at how we can load the data files in a Qlik Sense app using the folder structure. From the script editor in Qlik Sense, I can select the CAJ space and then click the Select data icon.

    connection.png


    I can see the folder structure I have set up and I can drill down into the folders to select the file I want to load.

    select file.png


    Notice that the file path in the script matches the hierarchical file structure I set up. This file structure can also be used when storing QVDs or inserting a QVS file.

     

    file path.png

     

     

    I love this new feature in Qlik Cloud. Sometimes I need to organize my files by their source or, as in the example I shared in this blog, by the year of the project. This allows me to organize my data in a way that makes sense for development. To learn more about this enhanced file management feature, check out Qlik Help.

    Thanks,

    Jennell

     

    Jennell_Yorkman

    Product Innovation

    Exploring Talend Management Console API with Talend API Tester


    Talend Studio is used to visually design and build jobs, which are published to a repository hosted in Qlik Talend Cloud.  Talend Management Console (TMC) is used to manage, schedule, configure, and monitor Tasks which run the Studio jobs on Remote Engines.

    But scheduling is not the only mechanism for running Tasks.  Sometimes it is desirable to trigger Tasks in response to an external event.  For example, a Task might be invoked whenever a file is dropped in a folder or perhaps in an S3 bucket.  Or maybe the Task needs to be triggered as part of a larger workflow in another application.

    In these cases, the TMC API can be used to flexibly invoke the Task.  The TMC API also has the advantage that it can pass parameters from the REST API call as context variables to the underlying job represented by the Task configuration.  Using the TMC API is more flexible than a scheduled Task, which must run a fixed configuration on a fixed schedule.

    The TMC API is quite powerful, and with it you can automate any task that you can do interactively via the TMC in the browser.  There are some excellent examples in the TMC API Use Cases documentation.  A very important use case is Using a Service Account to Run Tasks.  The example in the documentation assumes that you have a Service Account token and that you have the Task Id of the Task you want to execute.  In practice you need to make additional API calls to generate a Service Account Token and to get the Task Id given the human readable Task Name.

    In this blog post we will explore the basics of the TMC API using the Talend API Tester.  The Talend API Tester is a Chrome plugin that allows you to create and manage Requests against REST endpoints that have an OpenAPI Specification v3 (OAS v3) contract.  This was previously known as a Swagger contract.  OAS v3 is a broadly accepted standard for expressing service contracts and is comparable to, but much simpler than, the older WSDL for SOAP-based service consumption.  The Talend API Tester is included as part of Talend Cloud and is accessible from the Talend Cloud UI.

    We will make an initial API call to retrieve the Environment and Workspace Ids.  Then we will make a second API call to get the Task Id given the Task name.  Finally, we will invoke the Task using its Task Id.  Initially we will authenticate using a User Token.

    We will then extend this example to make some oAuth2 calls to retrieve a Service Account Token which will be used instead of the User Token.

    In a future blog post we will apply this technique to orchestrate multiple job Tasks from other Tasks to scale jobs horizontally.

    Pre-Requisites

    You will need an existing Environment and Workspace for which you have Execute and View permissions.  If you are creating a new Job which you will publish to the Workspace, you will also need the Publish permission.  In order for your user to be granted these permissions, they will need the Operator role.  If you do not have this role, ask your administrator to assign it to you.

    There must be a Remote Engine associated with the sample Workspace or Environment.

    You do not need Studio permissions if a job has already been published as a Task to TMC.  But if you are creating your own job in Studio then you will need the Integration Developer role.

    Rather than using your personal userid and password, you should create a Personal Access Token.

    A Service Account can be created by your Administrator or other privileged user.  If you are doing the Service Account examples, verify that your Service Account has access to the target Workspace.  It needs the same Execute and View permissions on the target Workspace mentioned earlier.

    Below is a screenshot of how your administrator can add Workspace permissions for your Service Account.  Initially the Service Accounts tab in the right-hand pane may be empty.  Clicking the Add Permissions button will allow the administrator to select the Service Account and select the appropriate Execute and View permissions.

    EdwardOst_0-1731945398630.png

    You will also need a sample job which has been published from Studio and configured as a Task in TMC that runs in the target Workspace.

    In order to run the examples, you will need the API Tester role.  This will allow you to launch the API Tester from the TMC.

    EdwardOst_1-1731945418402.png

    Importing the TMC API Into API Tester

    The TMC API Reference is available at https://api.talend.com/apis/.  We will be using the Orchestration and Processing APIs.

    EdwardOst_2-1731945468057.png

    Extracting Orchestration Services

    Click on the Orchestration API.  It takes you to the OASv3 UI representation.  Scroll down and inspect the Workspaces -> List Workspace operation.  It is a simple Get operation.  Note the Query parameters that allow you to specify filter criteria.  It is self-explanatory with name referring to the Workspace name and environment.name referring to the Environment to which the Workspace belongs.

    Click on the Try in API Tester dropdown and select the region in which your TMC is located.  Now click on the Try in API Tester link.

    EdwardOst_3-1731945503999.png

    You will be taken to the API Tester where a new API Tester Project called Orchestration 2021-03 will be created.  There is a folder for each section of the API in the left-hand pane.

    EdwardOst_4-1731945518427.png

    Expand the Workspaces section folder and you will see four operations including the List Workspaces operation.  Select the List Workspaces operation and the right-hand pane will display a form for populating the request parameters based on the OAS v3 specification.

    EdwardOst_5-1731945543182.png

    You can modify the input parameters including query parameters, headers, and the body to submit individual requests directly to the API.  However, we are going to create a series of requests as a Scenario. 

    Before we can create our new Scenario, we need to create a separate Project to store it in.  We will be using operations from multiple TMC Services, so it does not make sense to store our Scenario in the project of an individual Service.

    Click on the ellipsis context menu of the root My Drive folder and select Add a Project.

    EdwardOst_6-1731945563119.png

    A new project with an empty Scenario 1 will be created.

    EdwardOst_7-1731945578687.png

    Select Scenario 1 and rename it to Run Task by Name.

    EdwardOst_8-1731945597156.png

    Now return to the Orchestration project and click on the ellipsis and select Extract to Scenario.

    EdwardOst_9-1731945651669.png

    A dialog box is displayed for you to select which operations from the current project you want to include in the Scenario.  Select both the Tasks -> Get Available Tasks and the Workspaces -> List Workspaces operations and click Extract.  The screenshot below only shows one of these because of the scrollbar, but be sure to select both.

    EdwardOst_10-1731945666096.png

    Another dialog window asks in which Project you want to create the new Scenario.  Select the TMC API project you just created.

    EdwardOst_11-1731945683387.png

    The dialog box will update to show the new path to the selected project.  Now select the Run Task by Name scenario.

    EdwardOst_12-1731945702009.png

    The path is updated.  Finally click Save.

    The Get Available Tasks and the List Workspaces operations are copied into the new Run Task by Name scenario folder.

    EdwardOst_13-1731945724431.png

    Extracting the Processing Services

    We need one more API operation for our Scenario.  Return to the API documentation page and click on the Processing service.

    EdwardOst_14-1731945769704.png

    Select your TMC region endpoint and then click Try in API Tester.  It takes you back to API Tester and informs you that the new Project is named Processing 2021-03.

    EdwardOst_15-1731945785039.png

    Open the Processing 2021-03 project and click the ellipsis next to the Task Executions Service and select Extract to scenario.

    EdwardOst_16-1731945806163.png

    Since we selected just the Service rather than the whole Project we get a smaller list of Operations to export.  Select the Execute Task operation and click Extract To.

    EdwardOst_17-1731945819955.png

    Navigate to the TMC API project and select the Run Task by Name scenario as before and click Save.

    EdwardOst_18-1731945842096.png

    The Execute Task operation is added to the Scenario.  Select the Run Task by Name scenario in the left-hand pane and then click the Scenarios tab at the top to switch to a more detailed perspective.

    EdwardOst_19-1731945868724.png

    All three Operation Requests were added to the scenario, but they are not in the correct order.  Use the up and down arrows on the right to change the order of the requests.  The order you want is:

    1. List Workspaces
    2. Get Available Tasks
    3. Execute Task

    Your Scenario is mostly ready but we need to populate the requests with details for authentication and specific parameters for your sample job.

    Setting Up the Environment

    When you imported your Orchestration and Processing Projects a default Environment was created for each Project.  The Environment can be used to store common key-value pair configurations.  The only such environment variable created by default was the BaseUrl property which points to the regional API endpoint, e.g. https://api.us.cloud.talend.com for the US region.  But we will be adding a few more environment variables for things like your authentication token.

    First, let’s copy our API project Environment into a new Environment for our TMC API project.  Select the Run Task by Name scenario if it is not already selected in the left-hand pane and click on the pencil icon in the upper right representing the Environment settings.

    EdwardOst_20-1731945930702.png

    You are taken to the Environments editor dialog window.  Click on Add an Environment.

    EdwardOst_21-1731945946449.png

    Enter TMC API as the name for the new environment to match the name of the Project you created.  Also, check the box marked “Copy variables from” and select one of the environments created for the API projects you just imported, either Orchestration 2021-03 or Processing 2021-03.  Click the Create button to initialize your new environment with the same settings as those projects' environments.

    EdwardOst_22-1731945969766.png

    Your new environment is displayed and, not surprisingly, it starts off identical to the old environment.  In addition to the BaseUrl configuration we are going to add the user token created earlier.  We are going to create this as a Private variable.  This is important because it means it is private to you.  Other users using the Scenario will need to enter their own token.

    EdwardOst_23-1731945985210.png

    Name the new environment variable “tmc_token” and paste in the user token you created earlier.  Then click Close.

    EdwardOst_24-1731945999912.png

    Testing the Operations

    We are ready to test each of the individual requests in our Scenario.  As we progress, we will use the output of previous API calls as inputs to subsequent API calls.

    List Workspaces

    Start by testing the List Workspaces operation.  Select the List Workspaces operation from the Run Task by Name scenario in the left-hand pane to edit it.  In the right-hand pane add a request Header named Authorization.

    Set the value of the Authorization header to "Bearer ".  Note the space at the end of that string.  After entering the Bearer prefix, click the little wand icon to the right of the edit box.

    EdwardOst_25-1731946047681.png

    The wand icon opens the Expression Builder dialog box which allows you to reference environment variables or the response body of previous operations in the scenario.

    Click on the tmc_token in the Environment variables section of the Expression Builder.

    EdwardOst_26-1731946061299.png

    The Expression built for you is displayed in the lower section as well as a concrete evaluation Preview of the Expression in the current context.  In the diagram the preview has been redacted since it is a sensitive value.  Click Insert to return to the request editor.

    In this case we just want an environment variable.  We could have just manually entered

    Bearer ${"tmc_token"}

    But we wanted to use the Expression Builder with this simple example so we will be ready for the more complex steps later.

    The result of the Expression Builder has been inserted into the value of the Authorize Header.

    EdwardOst_27-1731946080124.png

    Note that although there is a Query Parameter it has not been enabled by the checkbox.  As a result, the current request will return all Workspaces in all Environments.

    Rather than overwhelm ourselves with a potentially large result set, let’s check the box to enable the Query Parameter and modify the filter criteria to reference the sample Workspace and Environment we created in the Setup section.  In the screenshot below I have used eost-dev as the Environment name and eost-lob1-dev as the Workspace name.  Yours will be different. Note that there is a semi-colon separator between the two criteria in the query parameter.

    name==eost-lob1-dev;environment.name==eost-dev

    The documentation, and hence the generated operation, may have only a comma, so you will need to modify it.
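    Outside of API Tester, the same List Workspaces request can be sketched in a few lines of code.  This is only a sketch: the /orchestration/workspaces path is an assumption based on the Orchestration API reference, and the token is a placeholder for your own Personal Access Token.

```python
from urllib.parse import urlencode

# Sketch of the List Workspaces request built by hand.
# The /orchestration/workspaces path is an assumption based on the
# Orchestration API reference; <tmc_token> is a placeholder.
base_url = "https://api.us.cloud.talend.com"              # your regional endpoint
query = "name==eost-lob1-dev;environment.name==eost-dev"  # semicolon = AND
url = base_url + "/orchestration/workspaces?" + urlencode({"query": query})
headers = {"Authorization": "Bearer <tmc_token>"}
```

    Sending this request with any HTTP client should return the same Workspace array that API Tester displays.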

    Finally, run the request by clicking on the green play arrow.

    EdwardOst_28-1731946115132.png

    The results are displayed below the green arrow so you may need to scroll down.

    EdwardOst_29-1731946130799.png

    The request output is displayed in Pretty format with collapsible elements.  It can also be displayed in raw json format as shown below.

    [
        {
            "id": "66ba3b68aea0c50341661f68",
            "name": "eost-lob1-dev",
            "owner": "eost",
            "type": "custom",
            "environment": {
                "id": "66ba3b67aea0c50341661f67",
                "name": "eost-dev",
                "maxCloudContainers": 0,
                "default": false
            }
        }
    ]

    The result of the List Workspaces operation is an array of one element which is the eost-lob1-dev Workspace.  It has a child element which is the eost-dev Environment.  For both the Workspace and Environment there is an id property in addition to the human readable name.  We will need the Workspace id returned from this operation as input to the next step.
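    The drill-down that the Expression Builder performs is just indexing into the JSON response.  As a minimal sketch using the response body shown above:

```python
import json

# Parse the List Workspaces response from this post and pull out the
# two ids needed in the next step.
response_body = '''
[
    {
        "id": "66ba3b68aea0c50341661f68",
        "name": "eost-lob1-dev",
        "owner": "eost",
        "type": "custom",
        "environment": {
            "id": "66ba3b67aea0c50341661f67",
            "name": "eost-dev",
            "maxCloudContainers": 0,
            "default": false
        }
    }
]
'''
workspaces = json.loads(response_body)
workspace_id = workspaces[0]["id"]                   # the Workspace id
environment_id = workspaces[0]["environment"]["id"]  # the Environment id
```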

    Get Available Tasks

    Select the Get Available Tasks operation from the Run Task by Name scenario in the left-hand pane to edit it.  In the right-hand pane add a request Header named Authorization and set its value to "Bearer ${tmc_token}" just like the previous request.

    EdwardOst_30-1731946273868.png

    Check the box to enable the Query Parameters.  There are a lot of Query Parameter options for Get Available Tasks.  Check the boxes next to the environmentId, workspaceId, and name query parameters to enable them.  The screenshot below shows the query parameters already populated, but they will initially be empty.

    EdwardOst_31-1731946289980.png

     

    Click in the environmentId query parameter text field and then click the wand icon next to it to build the Expression to return the environmentId.  The Expression Builder dialog window will be displayed.  Use it to drill into the previous List Workspaces result to retrieve the Environment id.

    On the left-hand pane of the Expression Builder there are different sections for Projects, Environment variables, and Global Methods.  The Projects section has the title “Repository My Drive”.  Find the TMC API project and click on it.  In the right-hand pane you will see the expanded results, which include the Run Task by Name scenario that you created.  Click on it and you will see the different Requests you have created within the scenario.  Click on the previous step, List Workspaces.  Now drill into response -> body -> 0.  The 0 is the first element of the array that was returned as the response.  Continue drilling into the environment -> id elements.  Since id is a string, an additional drilldown is possible to further parse the string, but we do not need it.

    The full drill down as well as the final expression are shown below.

    EdwardOst_33-1731946331615.png

     

    EdwardOst_34-1731946337610.png

     

    EdwardOst_35-1731946346688.png

     

    EdwardOst_36-1731946352422.png

     

    Repeat this process for the workspaceId field.  The drill down and final expression are shown below.

    EdwardOst_37-1731946372174.png

     

    EdwardOst_38-1731946378939.png

     

    EdwardOst_39-1731946384978.png

     

    EdwardOst_40-1731946389610.png

     

    For the name field, just specify the human readable name of your sample Task.  In the screenshots this has been tmc_sample_job but you can use any job you wish.

    Finally, run the request by clicking on the green play arrow.

    EdwardOst_41-1731946412831.png

     

    The request output is displayed in Pretty format with collapsible elements.  It can also be displayed in raw json format as shown below.

    {
        "items": [
            {
                "executable": "6723bd08708ce135e8d12bf3",
                "name": "tmc_sample_job",
                "workspace": {
                    "id": "66ba3b68aea0c50341661f68",
                    "name": "eost-lob1-dev",
                    "owner": "eost",
                    "type": "custom",
                    "environment": {
                        "id": "66ba3b67aea0c50341661f67",
                        "name": "eost-dev",
                        "default": false
                    }
                },
                "artifactId": "6723bd080bd9b9551259eccc",
                "runtime": {
                    "type": "REMOTE_ENGINE",
                    "id": "66cccd70a0ca60412e39e758",
                    "runProfileId": ""
                }
            }
        ],
        "limit": 100,
        "offset": 0,
        "total": 1
    }

    The result of the Get Available Tasks operation shows tmc_sample_job.  In addition to the human readable name property, the Task has a unique key which is the executable property.  The Task has links to the Environment and Workspace as well as the underlying Job (Artifact).  For both Workspaces and Environments there is an id in addition to the human readable name.  We will need the value of the executable property returned from this operation as input in the next step.

    Execute Task

    Select the Execute Task operation from the Run Task by Name scenario in the left-hand pane to edit it.  In the right-hand pane add a request Header named Authorization and set its value to "Bearer ${tmc_token}" just like the previous request.

    EdwardOst_42-1731946512361.png

    Unlike the previous List Workspaces and Get Available Tasks requests, which were GET operations, Execute Task is a POST operation.  It creates a new Execution, which is appended to the list of active Executions.

    Since it is a POST operation, we need to look at the Execute Task API documentation to understand the schema of the request body.  The request body is an ExecutableTask.  The  ExecutableTask object has four properties: executable, parameters, logLevel, and timeout.  We will only use the first three properties for our example.

    In the Body of the request paste in the following json template.

    {
      "executable" : "",
      "parameters" : {},
      "logLevel": "INFO"
    }

    We will use the Expression Builder to populate the executable property with the value returned from the previous Get Available Tasks Operation.  First position the cursor between the two quotes for the executable value.  Then click the wand icon.

    EdwardOst_43-1731946598755.png

    Now navigate to the previous Get Available Tasks in the Expression Builder and drill into the desired executable property of the result.  The drilldown navigation and the resulting expression are shown below.

    EdwardOst_44-1731946613678.png

    EdwardOst_45-1731946620977.png

    EdwardOst_46-1731946640691.png

    EdwardOst_47-1731946647347.png

    After clicking Insert in the Expression Builder dialog window, the resulting request should look like this:

    {
      "executable" : "${"TMC API"."Run Task by Name"."Get available Tasks"."response"."body"."items"."0"."executable"}",
      "parameters" : {},
      "logLevel": "INFO"
    }

    Now it is time to enter parameters.  Parameters depend on the Context Variables used by your job.  Any Context Variable default values you have in your job are overridden by the Task configuration properties.  Those Task configuration properties can in turn be overridden by the parameters specified in your API call.  So if you are happy with the defaults already defined in either the Task or the Job, you can omit the parameters property of the Request.

    Parameters for your sample job will differ, but the tmc_sample_job example has just a single Context Variable called message which is a String.  The request is shown below.

    {
      "executable" : "${"TMC API"."Run Task by Name"."Get available Tasks"."response"."body"."items"."0"."executable"}",
      "parameters" : {
        "message" : "Greeting Earthling"
      },
      "logLevel": "INFO"
    }
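    The same request body can be assembled programmatically rather than through the Expression Builder.  A minimal sketch, substituting the literal executable id returned by Get Available Tasks earlier in place of the expression:

```python
import json

# Build the Execute Task POST body.  The executable id is the value
# returned by Get Available Tasks earlier in this post.
body = {
    "executable": "6723bd08708ce135e8d12bf3",
    "parameters": {"message": "Greeting Earthling"},  # overrides Task/job defaults
    "logLevel": "INFO",
}
payload = json.dumps(body)
```

    The payload would then be POSTed to the Execute Task endpoint with the same Authorization header as the GET requests.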

    Run the request by clicking on the green play arrow.  The result of the Execute Task operation is only the executionId of the Execution that was created.

    {
        "executionId": "ab9c894d-3664-4a71-bce6-066f055937bc"
    }

    The operation provides an asynchronous interface, so you can poll the status of the Task execution with the Get Task Execution Status operation, which uses the executionId as part of its path.
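    As a sketch of what that polling URL looks like (the /processing/executions/{executionId} path is an assumption based on the Processing API reference; confirm it against the documentation for your region):

```python
# Construct the status-polling URL from the executionId returned by
# Execute Task.  The path is an assumption from the Processing API
# reference, not something shown in this post.
base_url = "https://api.us.cloud.talend.com"
execution_id = "ab9c894d-3664-4a71-bce6-066f055937bc"
status_url = f"{base_url}/processing/executions/{execution_id}"
```

    A GET on this URL with the same Authorization header returns the current status of the execution; repeat until it reaches a terminal state.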

    Running the Scenario

    All three operations in the Run Task by Name scenario are now working and wired together.  You can run the operations sequentially by clicking on the Scenario play button in the left hand side.

    EdwardOst_48-1731946864146.png

    The API Tester will attempt to run each operation in sequence.  New results from prior steps will be incorporated via the Expression Builder into subsequent steps.  The outputs will be available in the individual operations to review.

    Using a Service Account

    Regular user accounts for humans use User Access Tokens which are static.  In contrast, Service Accounts must generate temporary access tokens based on the OAuth2 Client Credentials Flow.  The temporary token can be generated by calls to the Get JWT Token operation of the TMC oAuth API.  This example assumes that you already have a service account created with appropriate permissions to access your sample workspace as described in the Pre-Requisites section.

    Get the Service Account Temporary Token

    From the TMC oAuth API, select the appropriate region for your Talend Cloud and then click “Try in API Tester”.

    EdwardOst_49-1731946933934.png

    A new project named oAuth 2021-03 is created.  Get JWT Token is the only operation in the API.  Click on the ellipsis next to the oAuth 2021-03 project and select “Extract to Scenario”.

    EdwardOst_50-1731946951026.png

    Select the Get JWT Token operation by clicking on the checkbox and then click Extract To.

    EdwardOst_51-1731946964860.png

    Navigate to the Run Task by Name scenario as in the previous sections and click Save.

    EdwardOst_52-1731946981663.png

    The Get JWT token operation request has been appended to the Run Task by Name scenario.

    EdwardOst_53-1731947004344.png

    Since we will need our JWT Service Account token for the subsequent steps, click on the Run Task by Name scenario and click the pencil icon to edit the scenario.

    EdwardOst_54-1731947018565.png

    Move the Get JWT token to be the first operation using the up-arrow keys.

    EdwardOst_55-1731947045833.png

    Notice that when you imported the oAuth API into API Tester the Environment was changed.  The environment is shown in the upper right.  Change the environment to the same TMC API environment used for the other requests.

    Now click on Edit Request for the Get JWT Token operation.  Notice that the Authorization header for the request expects an Environment variable named PublicAuthorizationHeader.

    EdwardOst_56-1731947061331.png

    As noted in the Generating a Service Account Token use case, the Authorization header needs to be the Base64 representation of the service account id concatenated with a colon and then the service account secret.  If you do not know the service account id you can look it up in the TMC from Users and Security -> Service Accounts as shown below.

    EdwardOst_57-1731947077764.png

    You could also look it up programmatically using the Service Accounts API with the List Service Accounts operation.

    In addition to the service account id, you need the service account secret.  It was displayed when the service account was created, and you should have it stored in a secure place, e.g. a secrets manager.  If you have lost access to the secret, you will need to generate a new service account.

    With the service account id and secret in hand, you can create the PublicHeaderAuthorization environment variable.  Click the pencil icon in the upper right corner to edit the TMC_API environment.

    EdwardOst_58-1731947099188.png

    Add a new private environment variable called PublicHeaderAuthorization as shown below.  Set it to the plaintext (not Base64) value of <service account id>:<service account secret>.  Note the colon between the two values.  We will convert this to Base64 in the next step.

    EdwardOst_59-1731947115831.png

    Back in the Get JWT Token request editor, select the Authorization header.

    EdwardOst_60-1731947132756.png

    Prefix it with the word “Basic”.  It should now read:

         Basic ${"PublicHeaderAuthorization"}

    Now we need to transform this value to Base64.  Click anywhere within the quotes surrounding PublicHeaderAuthorization and click the wand icon to open the Expression Builder.

    The Expression Builder displays the PublicHeaderAuthorization environment variable.  In the Methods column, select base64.  The resulting expression is shown below, along with a preview of your Base64-formatted service-account-id:service-account-secret pair.  Keep in mind that Base64 is just an encoding, not encryption, so make sure your environment variables are private and do not share even the Base64 value.  It is obfuscated, not encrypted.

    EdwardOst_61-1731947152551.png
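    As a quick sanity check outside API Tester, the same Basic header can be built in a few lines of Python (the id and secret below are placeholders, not real credentials):

```python
import base64

# Placeholder credentials -- substitute your real service account id and secret.
service_account_id = "my-service-account-id"
service_account_secret = "my-service-account-secret"

# Join with a colon, exactly as stored in the PublicHeaderAuthorization variable.
pair = f"{service_account_id}:{service_account_secret}"

# Base64 is an encoding, not encryption: anyone can decode it, so keep it private.
encoded = base64.b64encode(pair.encode("utf-8")).decode("ascii")
authorization_header = f"Basic {encoded}"
```

    Decoding `encoded` with `base64.b64decode` returns the original id:secret pair, which is why the header itself must be treated as a secret.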

    The payload for the request is always the same.  Substitute your API region (us, us-west, eu, ap, au) for the <env> placeholder below and place the result in the Body section of the form.

    {
      "audience":"https://api.<env>.cloud.talend.com",
      "grant_type":"client_credentials"
    }

    EdwardOst_62-1731947184428.png

    Now click the green play arrow to execute the operation.  The result is an access token in JSON format, as shown below.

    {
        "access_token": "--- big long key redacted ---",
        "token_type": "Bearer",
        "expires_in": 1800
    }

    Note that the token will expire in 1800 seconds (30 minutes).
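    The same Get JWT Token call can be sketched in Python.  The token endpoint path below is an assumption for illustration only; take the real URL from the Get JWT Token operation in the TMC oAuth API definition:

```python
import base64
import json
import urllib.request

# Assumed token endpoint path for illustration -- copy the actual URL from the
# Get JWT Token operation in the TMC oAuth API.
TOKEN_URL = "https://api.{region}.cloud.talend.com/security/oauth/token"

def build_token_request(region: str, account_id: str, secret: str) -> urllib.request.Request:
    """Build the Get JWT Token request (OAuth2 client credentials flow)."""
    pair = f"{account_id}:{secret}".encode("utf-8")
    headers = {
        "Authorization": "Basic " + base64.b64encode(pair).decode("ascii"),
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "audience": f"https://api.{region}.cloud.talend.com",
        "grant_type": "client_credentials",
    }).encode("utf-8")
    return urllib.request.Request(
        TOKEN_URL.format(region=region), data=body, headers=headers, method="POST"
    )

# Sending the request requires valid credentials and network access:
# with urllib.request.urlopen(build_token_request("us", "my-id", "my-secret")) as resp:
#     access_token = json.load(resp)["access_token"]  # expires_in: 1800 seconds
```

    The returned `access_token` would then replace the static token in the Bearer headers of the subsequent requests, and must be refreshed before its 30-minute lifetime ends.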

    Using the Service Account Temporary Token

    The new temporary Service Account token must be used for our subsequent queries.  We will need to update the Authorization headers of the List Workspaces, Get Available Tasks, and Execute Task operations.

    Select the List Workspaces operation within the Run Task by Name scenario.  Select the Authorization header, click within the quotes surrounding tmc_token, and click the wand to open the Expression Builder.

    EdwardOst_63-1731947265616.png

    The Expression Builder opens and displays the previously selected tmc_token environment variable.  We need to change that.  Select the TMC API project from the left-hand pane, and then Run TMC Task by Name -> Get JWT Token -> response -> body -> access_token, as shown below.

    EdwardOst_64-1731947279246.png

     

    EdwardOst_65-1731947284877.png

     

    EdwardOst_66-1731947290456.png

     

    After clicking the Insert button, the new Authorization property should read

    Bearer ${"TMC API"."Run Task by Name"."Get JWT token"."response"."body"."access_token"}

    Click the green play arrow to execute the List Workspaces to verify that the new Service Account based invocation works.  If you get an authorization error, double check that you have your service-account:service-account-secret pair set correctly in the environment, and that your service account has correct permissions on the Workspace.
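    In a script, the temporary token would be passed as a Bearer header on each subsequent request.  The workspaces URL below is a placeholder for illustration; use the URL from the List Workspaces operation in your TMC API project:

```python
import urllib.request

access_token = "redacted-temporary-token"  # from the Get JWT Token response body
# Placeholder URL -- copy the real one from the List Workspaces operation.
url = "https://api.us.cloud.talend.com/workspaces"

# The temporary token goes in the Authorization header with the Bearer scheme.
request = urllib.request.Request(url, headers={"Authorization": f"Bearer {access_token}"})

# Sending it requires a valid, unexpired token (remember the 30-minute lifetime):
# import json
# with urllib.request.urlopen(request) as resp:
#     workspaces = json.load(resp)
```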

    For the Get Available Tasks and the Execute Task operations of the Run Task by Name scenario you can just copy-paste the same Authorization expression from the List Workspaces Authorization shown above.

    Execute those operations as well to confirm the end-to-end test.

    Running the Scenario

    All three operations in the Run Task by Name scenario are now working and wired together.  You can run the operations sequentially by clicking on the Scenario play button in the left-hand panel.

    EdwardOst_67-1731947377968.png

    The API Tester will run each operation in sequence.  Results from prior steps are incorporated into subsequent steps via their Expression Builder expressions, and the outputs are available to review in the individual operations.

     

     

    EdwardOst

    Support Updates

    Watch Q&A with Qlik: New to Qlik Cloud


    Don't miss our previous Q&A with Qlik! Hear from our panel of experts to help you get the most out of your Qlik experience.

     

     

    SEE THE RECORDING HERE

     

    QnARecording.png

    Troy_Raney

    Product Innovation

    Automate your Machine Learning with Qlik Talend Cloud Data Pipelines


    In today's data-driven world, organizations are increasingly leveraging Machine Learning (ML) to extract valuable insights from their data. This powerful technology enables businesses to make data-driven predictions, classify data, and uncover hidden patterns. As a result, ML can provide a significant competitive advantage, improve operational efficiency, and enhance customer experiences. 

    However, implementing ML can be challenging. Two key obstacles often arise: poor data source quality and inefficient ML model development. Overcoming these problems requires providing accurate and complete source data for effective ML model training. Also, reducing the time-consuming and resource-intensive effort needed to develop and deploy these models. 

    To overcome these challenges, organizations need a streamlined approach to integrate high-quality data with ML capabilities. Qlik simplifies this process, making it easier to build data pipelines that leverage the power of ML to drive better business outcomes. 

    Introducing Qlik AutoML, Qlik Application Automation, and Qlik Talend Cloud 

    Qlik AutoML automates machine learning by using classification or regression models to find patterns in data that can be used for predictions. Qlik AutoML trains and tests your machine learning experiments, making them ready for deployment. These machine learning models can be integrated within Qlik Sense applications, Qlik Automation workflows and external applications.  

    When configuring Qlik AutoML experiments, you select the target and features used within the predictive model.  Qlik AutoML automatically preprocesses, trains, and optimizes the model, using automatic feature engineering based on your choices.  Once the experiment is complete, your ML models can be deployed through APIs for real-time predictions.  Qlik AutoML facilitates an iterative workflow by enabling you to tune your model parameters for better optimization.
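    As an illustration of API-based consumption, a real-time prediction request might be assembled as below.  The URL, API key, and payload schema are placeholders for illustration, not Qlik's documented contract; take the real values from your deployment's Real-time prediction API details:

```python
import json
import urllib.request

def build_prediction_request(url: str, api_key: str, rows: list) -> urllib.request.Request:
    """Build a real-time prediction request for a deployed AutoML model.

    The payload schema ({"rows": [...]}) and auth scheme here are assumptions
    for illustration -- use the real contract from the deployment's API details.
    """
    body = json.dumps({"rows": rows}).encode("utf-8")
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

# One Titanic passenger record using features like those chosen in the experiment:
req = build_prediction_request(
    "https://example.invalid/realtime/predictions",  # placeholder deployment URL
    "my-api-key",                                    # placeholder API key
    [{"Pclass": 3, "Sex": "male", "Age": 22, "Fare": 7.25}],
)
```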

    Qlik Application Automation is a powerful tool that enables you to automate data, analytics and ML processes without writing any code. It provides a visual interface where you can easily create and manage automations, consisting of a sequence of actions and event triggers. Automations are a simple way to solve the data consumption use case for ML models where users can choose from templates or build their own workflows by assembling predefined connectors and logical blocks.

    Qlik Talend Cloud 

    Bringing it All Together 

    Qlik Cloud capabilities allow you to use automation to create a workflow that acquires data from any supported source into a target with real-time predictions and load data visualizations within a dashboard application.  

    We will first demonstrate using the Titanic passenger survival data set to create and deploy a classification model with AutoML. A data pipeline will ingest and transform source data that can be used to predict whether a passenger would have survived the Titanic. An application automation will invoke the classification model in real time and update the resulting passenger survivor data in a Qlik Sense application. 

    The Qlik platform can thus demonstrate providing predictions to a data integration pipeline through automation. (Using the Titanic dataset from Kaggle, we can build a Qlik data pipeline that predicts which passengers survived the Titanic.) 

    Setting up and running Qlik Talend Cloud Services 

    Build a classification prediction model using a Qlik AutoML experiment with the Titanic data set. (Choose the deployment model based on F1 score.) 

    damienedwards_0-1730828534166.png

     

     
    Deploy the generated CatBoost Classification model to predict survivors in our workflow using a Real-time prediction API URL.  

    damienedwards_1-1730828534169.png

     

     

    A Qlik Talend Cloud Data Integration pipeline is used to load source data from MySQL and transform the data for model predictions into a Snowflake target.  

    damienedwards_2-1730828534170.png

     

     

    Create a Qlik Automation to invoke the Qlik ML prediction model on the Titanic Transformed dataset created in the QTC Data pipeline. 

    Qlik AutoML connector using the deployed Titanic classification API, with features shown below 

     

    damienedwards_3-1730828534171.png

     

    damienedwards_4-1730828534172.png

     

     

    Qlik Application Automation workflow sequence with embedded processor blocks. 

    • Start the Data Integration Pipeline data task to load the dataset used for predictions. 
    • The transformed dataset calls the deployed ML Flow Titanic Model API to generate classification predictions for passenger survival. 
    • Load model predictions into a Qlik Sense dashboard. 

    damienedwards_5-1730828534172.png

             

    Qlik Sense Application Dashboard loaded with Real-time prediction data. 

    damienedwards_0-1730828884841.png 

     

    Conclusion 

    Qlik Talend Cloud delivers real-time prediction capabilities by adding machine learning to your data pipelines. The Qlik Application Automation features make it easier to integrate the services Qlik Talend Cloud provides for data integration and analytics. The platform reduces the complexity of deploying ML models within your data pipeline and integrating the results into your analytics applications. With the Qlik Talend Cloud platform, organizations can quickly adopt the power of machine learning within their enterprise data architecture. 

    damienedwards

    Qlik Academic Program

    India leads with the highest demand for data analytics skills globally

     
    India leads with 17.4% of its job postings looking for data analytics skills, followed by the US at 8.8%, positioning these countries as global hubs for data analytics expertise, the report added. Other markets with high demand for data analytics skills include the UK (7.5%), New Zealand (6.7%), and Australia (6.4%).

    This is according to a recent report published by Cornerstone on 2024 Global State of the Skills Economy.

    Read more about this report here: https://www.financialexpress.com/jobs-career/india-leads-in-demand-for-data-analytics-globally-global-state-of-the-skills-economy-report-3620853/ 
     
     

     

    Pankaj_Muthe

    Design

    Let's Take a walk on the Calendar / Date Bridge: Working with Multiple Dates / ...


    Do you need answers for specific points in time when working with multiple calendars / dates? Then use a calendar bridge. A calendar bridge is used to create what Qlik commonly calls a Canonical Date / Canonical Calendar. A calendar bridge is nothing more than a simple table that links one or more dates to a single common date, called a canonical date. This simplifies time period selection during analysis, where multiple calendar / date filters can be confusing to the user.  The bridge table is linked to a key field in your data and created with a new dimension that simply describes each date type you have. (You can link this to a Master Calendar if you require more granular time periods.) Your charts can then use aggregated measures with the defined date type in a set expression to show the specific results. Watch the video below to learn more, and see the attached app and sample data if you want to try it yourself. 

    Want to learn more tips and tricks like these? Don't forget to join me tomorrow at 10 AM ET for Set Analysis: Redux on the next Do More with Qlik webinar.

    Register here

    2023-10-31_11-12-57.jpg

    This article by our beloved HIC is a great reference with more detail if needed.

    Part 2: Coming soon....


    Michael_Tarallo

    Product Innovation

    Qlik Data Integration Client Managed November 2024 General Availability Release


    Qlik Data Integration Client Managed November 2024 General Availability Release

    November traditionally brings many celebrations, from Diwali and Guy Fawkes Day to Thanksgiving, just to name a few. Another celebration to add to that long list is the General Availability of the November 2024 releases of Qlik Replicate and Qlik Enterprise Manager. Our Qlik Replicate customers looking to empower their SAP data will especially want to celebrate, as we now support OData for sourcing data from SAP systems.

     

    Qlik Replicate November 2024 General Availability Release

    New Endpoints

    Qlik is uniquely placed by offering several methods of replicating data with ease and automation and the added ability to combine it with other data seamlessly. SAP and Oracle continue to be mission-critical systems for many organizations, and two new endpoints have been added to enable the most efficient ways to replicate data from these two powerhouses.

    New SAP OData source endpoint

    Qlik Solutions has supported SAP as a source endpoint for a very long time. We have deep domain knowledge and expertise, and through this, we offer many different ways to source and extract data from your SAP applications that best match your use cases, latency requirements, and SAP licensing considerations.

    In this release, we are excited to announce the addition of the new SAP OData source endpoint. We purposely developed this new endpoint with a keen eye on performance and scalability, and it supports Change Data Capture (CDC). The new OData endpoint uses a secured web service connection to extract data from SAP applications.

    Adam_Mayer_0-1731660114522.png

     

    The new SAP OData source endpoint uses the OData v2.0 protocol, consistent with SAP support, to explore SAP and get data out. All ODP-based services, such as dynamic filters and projection list forwarding, are supported to help reduce load and impact. Supported objects include CDS Views, Extractors, BW data providers, and SLT. Data movement with CDC can be supported via two options: configurable periodic time periods, or running a task on a schedule using Qlik Replicate’s built-in scheduler.

    See the online help for using SAP OData as a source

    The OData endpoint is coming soon to Qlik Talend Cloud through the Data Movement Gateway; stay tuned.

    Qlik will continue supporting all our existing SAP source endpoints to serve customers' requirements. 

    To learn more about all the many ways that we support SAP and other sources in Qlik Replicate and how to use them, check out the online help section on managing sources 

     

    New Oracle XStream source endpoint

    We are pleased to announce the new Oracle XStream source endpoint, which brings several improvements over using only the Oracle Redo logs for extracting data. This new endpoint interfaces directly with the Oracle XStream API, so a Replicate task now creates an XStream Out Server. With this new method, you will experience better performance, increased reliability, and simplified maintenance. Additionally, by utilizing the API, we can ensure the endpoint is future-proofed for later Oracle versions.

    Adam_Mayer_1-1731660114525.png

     

    See the online help for using Oracle XStream as a source

    There have also been improvements across several other endpoints:

    Source endpoint improvements

    • MySQL sources - LOB performance improvement
    • SAP HANA Endpoint – Trigger-based support “Full Record” mode

    Target endpoint improvements

    • Databricks Delta - new staging option: Databricks Volume
    • Snowflake target - Snowpipe Streaming improvements
    • IBM DB2 for z/OS target – added new ZLOAD options
    • DDL History control table - expanded support for 11 additional target endpoints

    Scheduling enhancements

    The Qlik Replicate Scheduler is used to schedule Replicate task operations such as running, stopping, reloading, and resuming tasks as one-time or recurring jobs. Schedules can be configured to occur once, daily, weekly, or monthly.

    This release introduces two new options, monthly and every, to make scheduling even more flexible.

    Monthly – gives you more granular options to schedule tasks to run on the <nth> <weekday> of every month and at the specified time.

    Adam_Mayer_2-1731660114532.png

     

    Every - This lets you schedule tasks to run at regular intervals. This new option allows you to control the interval, starting on the specified date and time.

    Adam_Mayer_3-1731660114535.png

     

    Note: The minimum interval is 5 minutes, and job intervals are always calculated according to the original start time.
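    The anchoring rule above can be illustrated with a small sketch (an illustration of the stated behavior, not Replicate's actual implementation):

```python
from datetime import datetime, timedelta

def next_run(start: datetime, interval: timedelta, now: datetime) -> datetime:
    """Next run time when intervals are anchored to the original start time."""
    if interval < timedelta(minutes=5):
        raise ValueError("minimum interval is 5 minutes")
    if now <= start:
        return start
    # Count whole intervals already elapsed, then step to the next boundary.
    elapsed = now - start
    return start + (elapsed // interval + 1) * interval

# A job started at 09:00 with a 30-minute interval, checked at 10:10,
# next fires at 10:30 -- aligned to the 09:00 start, not to "now" plus 30 minutes.
start = datetime(2024, 11, 1, 9, 0)
nxt = next_run(start, timedelta(minutes=30), datetime(2024, 11, 1, 10, 10))
```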

     

    Security and Compliance

    Qlik has always taken trust and security seriously, implementing security and privacy by design in our products for a long time. We offer a world-class architecture and experience to meet your security, compliance, and privacy needs confidently.

    The following endpoints have enhanced access and authentication methods

    • Azure Database for PostgreSQL – New Azure AD authentications
      Added two new Active Directory authentications
      • Service Principal – when Qlik Replicate is running on-premises
      • Managed Identity – when Qlik Replicate is running on Azure VM
    • IBM DB2 for z/OS target – added SSL
      Supports encrypted communication between Replicate and DB2, applies to both ODBC and JDBC connections
      Adam_Mayer_4-1731660114537.png

       

    • Snowpipe Streaming improvements - Support for OAuth authentication

     

     

    Qlik Enterprise Manager May 2024 General Availability Release

    Enhanced API Support - We continue to extend the Qlik Enterprise Manager APIs by now supporting the ability to resume change processing from a position in the log (SCN or LSN) using the RunTask method. (RESUME_PROCESSING_FROM_POSITION)

    We have also enabled API authentication using access tokens via JSON Web Token (JWT).

    ⚠️ Note: this requires JWT to be set up and configured ⚠️

    All our APIs can be used via REST, .NET, and Python. More details can be found on Qlik Help: Qlik Enterprise Manager API guide

     

    As always, each new release is fully supported for two years. To check the status of support for your currently installed version, please see the relevant product lifecycle pages.

    • Qlik Replicate Product Lifecycle
    • Qlik Compose Product Lifecycle
    • Qlik Enterprise Manager Product Lifecycle

    We hope you enjoy using Qlik Data Integration products and would love to hear your feedback and success stories, especially in any improvement gains you achieve.

    To get the latest versions, please visit the Downloads and Release Notes section on Qlik Community.

    To learn more about what is included in these releases, be sure to check out the Release Notes, which are available here

     To obtain any of these releases, go to the Qlik Downloads Site in the Community and filter “Product Category” by “Qlik Data Integration”, and then select the product and the versions you would like to download.

    Note: For most products, selecting “Latest release and patch” under the “Show Releases” should be enough.

    If required, you can filter further by selecting the latest “Release” and/or Service Release (SR) version under “Release Number”.

    Adam_Mayer_5-1731660932816.png

     

    Adam_Mayer

    Support Updates

    Introducing Qlik Cloud Tabular Reporting


    We are excited to introduce tabular reporting from Qlik Cloud. Now customers can address common, centrally managed, tabular report distribution requirements from within a Qlik Sense Application! With tabular reporting, report developers can create custom and highly formatted XLS documents from Qlik data and Qlik visualizations; Governed Report Tasks can burst reports to any stakeholder, ensuring that the Qlik platform is the source for operational decisions, customer communications and more.

     

    Some feature highlights:

    • Report template creation, using data and visualizations from a Qlik Sense App, all with the familiarity of Microsoft Office 365 using Add-in technology
    • Qlik Cloud governed report task control from within a Qlik Sense App
    • In app distribution list management to support burst report distribution to any stakeholder (internal or external)
    • Execution of Qlik NPrinting authored XLS report templates uploaded to Qlik Cloud
    • Powered by the Qlik Reporting Service, reports are delivered from a scalable cloud service that solves complex enterprise reporting jobs

     

    Want to get started with your first Tabular Report?

    Access our Getting Started section in your Qlik Cloud app (available for users with Can Edit permissions). Open your app, (a) choose your activity, and select (b) Reporting.

    access reporting.png

    From here, you can begin with our introductory videos and configuration instructions:

    getting started with tabular reporting.png

     

    Want to know more about Qlik’s Tabular Reporting feature?

    • See the Tabular Reporting introductory videos
    • To get started see Tabular reporting in Qlik Cloud Analytics
    • (new!) The future of Qlik Cloud Reporting: STT - Qlik Cloud Reporting Evolution 
    • To access and discuss the Qlik NPrinting Technical Preview, go to Qlik's Technical Previews 
    • The Qlik Excel add-in can be deployed and installed for compatible web and desktop versions of Microsoft Excel within Microsoft 365; see Deploying and installing the Qlik add-in for Microsoft Excel, and speak with your Office 365 administrator if you wish to deploy the capability
    • Generating tabular reports in Qlik Cloud Analytics is a value-add Qlik Reporting Service capability. Check with your service account owner about your Qlik Cloud subscription's included capacities. Please be aware that overage will be monitored and capped starting in 2024.
    • Adopting customers should familiarize themselves with guardrails and limits; see Qlik Reporting Service specifications and limitations

     

    Thank you for choosing Qlik,
    Qlik Support

     

    Sonja_Bauernfeind

    Product Innovation

    Qlik Cloud Reporting's Perfect Update


    It’s been an exciting year for Qlik Cloud Reporting. Back in December 2023, we took a big step by adding tabular reporting to our dashboard-style capabilities, something many of you have been asking for. This update comes packed with features like report task management, easy imports of recipient lists from connected data sources, powerful filtering using bookmarks, and simple uploads for tabular report templates. Now, report developers can easily handle centralized tabular reporting tasks right inside Qlik Sense apps. Thanks to our incredible Qlik Cloud customers, we’re gearing up to roll out even more cool features!

    We’re putting the finishing touches on our cloud reporting capabilities for Q4:

    • Report Cycling: Developers can set up report tasks to cycle through a dimension, creating reports for each value.
    • Report Task History: Developers will have better visibility into past task runs with a new report task history feature that keeps track of up to three months of execution data. Plus, we’ll surface error and warning messages to help troubleshoot any issues.
    • PixelPerfect Authoring: We’re introducing a new template authoring feature for PixelPerfect reports! Create advanced document templates that meet strict customer or compliance needs using Qlik data, visualizations, or even a library of print-ready visuals.

     

    As BI leaders get ready for Qlik Answers, Qlik AutoML, and Qlik Talend Cloud, we know that having flexibility in report types and control over operations is key for keeping everyone in the loop.

    As we launch these new features, we’ll also be focusing on enhancing operational task controls, including email reporting solutions.

    Check out our free e-learning module today!

    Join our webinar to explore Qlik Cloud’s Reporting Evolution and get a sneak peek at everything it has to offer!

    Hosted on November 13th by Qlik’s VP of Analytics Portfolio Marketing, Mary Kern, this episode of Qlik Insider dives into the role of reporting in the world of modern BI. 

    Resources:

    • Designing PixelPerfect report templates | Qlik Help YouTube
    • PixelPerfect Report Authoring | SaaS in 60
    • Course: Creating Reports with PixelPerfect Editor | Qlik Learning

    QlikProductUpdates

    Support Updates

    Talend 8 – Java & Camel upgrade coming early 2025


    To ensure continuous support for your data integration processes and to leverage the latest innovations, we are providing this advance notice that Java 17 and Camel 4 will become the new standard versions across Talend Data Fabric version 8. We initiated a transition process by introducing Java 17 support in October 2023, and we are now completing the last leg of this transition. 

    As of the February, 2025 release: 

    • Talend Studio will perform new builds of Jobs and Services only in Java 17, and Route builds will rely on Camel 4 instead of Camel 3. 
    • Talend Runtime will only support Java 17 and will run Routes with Camel 4. 

     

    To make this transition smoother: 

    • In January 2025, JobServer and Remote Engine will be improved so they can optionally run each Job or Service in its corresponding Java version (8, 11 or 17) automatically, providing better support for the heterogeneity inherent to a transition period. 
    • In February 2025, Studio will also propose that you opt for a version that remains on Java 8/11 and Camel 3. This version will only be supported until August 31, 2025, and only for “Severity 1 Errors”. 
    • JobServer and Remote Engine will continue to support running Jobs and Services in Java 8 and 11 until December 31, 2026. 

     

    Step-by-step guides will be made available when the new versions of these components become available. 

    For further questions please contact Qlik Talend Support and subscribe to our Support Blog for future updates. 

     

    Thank you for choosing Qlik, 

    Qlik Talend Global Support

    Katie_Davis

    Design

    Using Qlik-cli to automate workflows


    What is Qlik-cli? 

     

    Qlik-cli is a command line interface for Qlik Sense SaaS, providing access to all public APIs. The tool enables you to administer your tenant, develop and manage apps, and migrate data, making it easier to script and automate workflows.  

    Qlik-cli is mainly developed for Qlik Sense SaaS, where the aim is to support all the publicly exposed APIs. However, you can also expect functionality that supports migrating resources, for example apps, from Qlik Sense Enterprise on Windows to Qlik Sense SaaS. 

    Here is an overview of how you can use the tool. 

    qlik-cli_help.png

     

    How to get started  

     

    To get started, simply install the tool. Below is a short video that walks you through the installation process.  

    I hope you will try the tool soon. When you do, let us know what you think, and if you are already using and loving the tool, share your favourite use cases with us.  

     

    Further reading: 

    https://qlik.dev/tutorials/get-started-with-qlik-cli 

    https://qlik.dev/libraries-and-tools/qlik-cli 

     

    Gertrude
