Hello,
According to the documentation, the Calculated Condition is not supported in NPrinting May 2022 (Working with Qlik objects | Qlik NPrinting Help).
In fact, NPrinting works fine with a Calculated Condition when a single user and a single filter are used.
However, it behaves unpredictably with multiple user filters that affect the Calculated Condition. See the case: Case Details - Qlik Community.
May I ask you to include Calculated Condition support in a future release?
According to the case above, you would not need to implement this functionality from scratch, only resolve the issue described in that case.
Currently we use attrep_status for task-specific information such as task timestamps. Since the table is shared across all our tasks, there is contention on it when multiple tasks update it at the same time.
Our current mitigation is to use separate tables per task (or per group of similar tasks) and to reduce the update frequency to 5 minutes. This causes a lot of admin work, extra data consolidation, and latency in task status.
We request that this information be exposed via the QEM GetTaskDetails API. This would reduce a lot of admin work and contention at the target (on the attrep_status table).
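To make the request concrete, here is a rough Python sketch of how we would poll that information through the Qlik Enterprise Manager REST API instead of reading attrep_status on the target. The host, server and task names are placeholders, and the exact endpoint paths, header name and response fields may differ by QEM version, so treat this as an assumption rather than a working recipe.

import requests

QEM = "https://qem-host/attunityenterprisemanager/api/v1"

# Log in and capture the API session token (returned as a response header).
login = requests.get(f"{QEM}/login", auth=("DOMAIN\\qem_user", "password"))
session_id = login.headers["EnterpriseManager.APISessionID"]

# Fetch task details; the idea is that the per-task timestamp/status data we
# currently write to attrep_status would be part of this payload instead.
resp = requests.get(
    f"{QEM}/servers/MyReplicateServer/tasks/MyTask",
    headers={"EnterpriseManager.APISessionID": session_id},
)
details = resp.json()
print(details.get("task", {}).get("state"))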
Our situation is that we have more than 100 data sources and we want to use one Azure Databricks target endpoint to move the data.
However, each database needs a different target directory folder, which means we would also have to create more than 100 target endpoints.
If we could use a variable such as the task name in the target directory, we could solve the whole thing with a single target endpoint (see the sketch below for the kind of substitution we mean).
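A minimal sketch of the substitution we have in mind, where "${TASK_NAME}" is a hypothetical placeholder (not existing Replicate syntax) that would be resolved per task against a single endpoint definition:

from string import Template

# One target-directory definition shared by every task.
target_directory = Template("/mnt/landing/${TASK_NAME}/")

# At run time each task would resolve the placeholder to its own folder.
for task_name in ["Sales_DB", "HR_DB", "Finance_DB"]:
    print(task_name, "->", target_directory.substitute(TASK_NAME=task_name))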
We would like to see the ability to automatically have metadata like app name, object name and current selections added to a header or footer when exports are done in Qlik Cloud for both Qlik Sense and QlikView apps.
This was really useful in QlikView but has been missed by us and our users in Qlik Cloud.
When the filter panel is used with the Aggr function to refine a filter, it operates on the dimension specified inside the Aggr function.
This is why our customers who rely on dimensions built with the Aggr function have asked us to disable the filter function for pivot tables.
For reference, here is a similar case:
How to remove or hide a filter control on a pivot table?
We kindly request your consideration of this matter.
Could we have an option in the Download screen that allows an end user to cycle through the different values in a dimension and batch the downloads, in any of the three export formats? It would also be great to allow an export of the whole sheet as .jpg/.pdf, again cycling through dimension values.
It would ideally appear as a dropdown to select the dimension or field from the data model. I've highlighted where this would ideally appear in the attached screenshot.
I appreciate this overlaps with NPrinting/Qlik Reporting Service, but providing this to the end user would give so much versatility and really unlock a lot of value for them, rather than "batch reporting" being seen as something external to Qlik or centrally run.
We leverage MS D365 CRM for many lines of business at our firm, each with its own unique CRM Base URL. In order to leverage this connector and gather data from each line of business, we have to manually update the Base URL in the connector, fetch the data, store the data, and then repoint the Base URL to the next instance.
This prevents us from automating any of our loads from multiple CRM environments within Qlik because when creating the connection within Qlik it points at a single instance of the connector.
If you could enhance the connector to allow the Base URL to be passed as part of the connection string, we would be able to pass a variable for the site name within the Base URL,
e.g. https://$(vSiteName).api.crm.dynamics.com/
Alternatively, if we could save multiple connections in Qlik pointing to different Base URLs, that would work too.
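To illustrate what we currently have to do outside of Qlik, here is a rough Python sketch that loops over several D365 base URLs and pulls the same entity from each via the Dynamics Web API. The URL list and the get_access_token() helper are placeholders for our environment; the point is that the connector could iterate the Base URL the same way if it were part of the connection string.

import requests

BASE_URLS = [
    "https://lob-a.api.crm.dynamics.com",
    "https://lob-b.api.crm.dynamics.com",
]

for base_url in BASE_URLS:
    # Hypothetical OAuth helper that returns a token scoped to this instance.
    token = get_access_token(base_url)
    resp = requests.get(
        f"{base_url}/api/data/v9.2/accounts",
        headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
    )
    rows = resp.json().get("value", [])
    print(base_url, len(rows), "accounts")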
This is an extension of what was already requested by another user here:
Based on that ideation, Azure AD support was added for SQL DB, but only with user/password authentication.
Our security standards require Azure AD authentication using a service principal, which is not supported for the SQL DB connector, and is only supported for the Synapse connector's staging store, not the Synapse connection itself.
We would also like to take it a step further and support retrieving the secret from an Azure Key Vault, instead of hardcoding it in the connector's configuration.
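To be clear about the pattern we are asking for, here is a short sketch using the Azure SDK for Python: authenticate as a service principal, retrieve the database secret from Key Vault at connection time, and request an AAD access token for Azure SQL. All names are placeholders; this only illustrates the flow we would like the SQL DB and Synapse connectors to support natively.

from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient

# Service principal credential (certificate-based auth would be preferable).
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<service-principal-app-id>",
    client_secret="<bootstrap-secret>",
)

# Pull the database credential from Key Vault instead of the connector config.
vault = SecretClient(vault_url="https://my-vault.vault.azure.net", credential=credential)
db_secret = vault.get_secret("sql-db-secret").value

# An AAD token scoped to Azure SQL, which the connector would use to connect.
token = credential.get_token("https://database.windows.net/.default")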
For Qlik Replicate with SAP IQ as a target, we currently have transactional mode and batch mode. Transactional mode will not work against SAP IQ because it does not handle OLTP workloads. Batch mode is better for SAP IQ but does not give transactional integrity.
Batch mode does the following:
Begin tran
Delete table a
Commit tran
Begin tran
Insert table a
Commit tran
Begin tran
Update table a
Commit tran
Begin tran
Delete table b
Commit tran
Begin tran
Insert table b
Commit tran
Begin tran
Update table b
Commit tran
Begin tran
Etc
Commit tran
We’d like to raise an ideation to create a new mode, “Transactional Integrity Batch Mode”.
This would do the following:
Begin tran
Delete table a
Insert table a
Update table a
Delete table b
Insert table b
Update table b
Etc
Commit tran
Azure Event Hubs is only supported as a target in Replicate for Windows. We run Replicate on Linux, where there is no Event Hubs support, so instead we use the Kafka endpoint to connect to Event Hubs via an undocumented workaround, with limitations (e.g. throttling is not possible).
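For context, this is the kind of Kafka-compatible connection the workaround relies on: Event Hubs exposes a Kafka endpoint on port 9093 with SASL/PLAIN, where the username is the literal string "$ConnectionString". The namespace, hub name and connection string below are placeholders; this mirrors our manual workaround, not a Replicate feature.

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="my-namespace.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="Endpoint=sb://my-namespace.servicebus.windows.net/;SharedAccessKeyName=send;SharedAccessKey=<key>",
)
producer.send("my-event-hub", b"change record payload")
producer.flush()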
Currently there is no way of knowing when the CDC counters were initialized in the change processing section of the monitoring tab. You can see per table how many update/insert/delete statements were done, but it doesn't show since *when*.
An enhancement to the GetTableStatuses API is requested:
Add a base timestamp for when the counters were initialized to the table_cdc_info part of the response.
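As a hypothetical illustration (the field names are our suggestion, not the current API), the table_cdc_info block could carry the initialization timestamp alongside the existing counters:

# Proposed shape of one entry in the GetTableStatuses response; field names
# are illustrative and "counters_initialized" is the requested addition.
proposed_table_status = {
    "name": "ORDERS",
    "table_cdc_info": {
        "insert_count": 1200,
        "update_count": 340,
        "delete_count": 15,
        "counters_initialized": "2023-01-15T06:00:00Z",  # requested new field
    },
}

print(proposed_table_status["table_cdc_info"]["counters_initialized"])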
Hi Guys,
It would be useful to have extended search capabilities in the search tool in the Qlik Sense Hub,
for example using combinations of words,
and the ability to combine wildcards ("*", etc.).
Also, the ability to search by stream, including grouping the search results by stream when requested.
Aldo.
NPrinting is a great product for pixel-perfect reporting, and it is functionality that many customers still want to use. The Qlik Reporting Service is a poor tool for reporting, so the idea is to make NPrinting available for Qlik SaaS, or perhaps to provide a bridge between Qlik SaaS and NPrinting on-premise.
Thanks
For data sources that are not Direct Query, I can download the data from a table chart into Excel.
However, when using Direct Query to connect to Snowflake and create a table chart, the data download menu does not appear. The ability to download data to Excel is also required when using Direct Query.
Hello
Problem: DataSources like 0CO_OM_CCA_20, 0CO_OM_CCA_30 or 0CO_OM_OPA_20 do not support DELTA extraction in the SAP application, so we must load data slices (e.g. by fiscal period) in FULL mode. Each time a period has passed, we must adjust the extraction filter manually.
Solution: please implement a dynamic filter option via input variables at /n/QTQVC/EXTREP -> Adjust/ Create Filter.
Benefit: FULL-only DataSources can be scheduled automatically.
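To illustrate the kind of value a dynamic filter could resolve at run time, here is a small sketch that derives the current SAP fiscal period (FISCPER, format YYYYPPP). It assumes fiscal-year variant K4 (calendar year, 12 periods) purely for illustration:

from datetime import date

def current_fiscal_period(today: date) -> str:
    # FISCPER format YYYYPPP, e.g. 2023004 for period 4 of fiscal year 2023.
    return f"{today.year}{today.month:03d}"

print(current_fiscal_period(date.today()))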
Give the ability to monitor any visualization object in the hub (mostly relevant for Vizlib), not just Qlik native objects.
Currently, Qlik publishes a list of objects that can't be monitored in the hub here: https://help.qlik.com/en-US/cloud-services/Subsystems/Hub/Content/Sense_Hub/Hub/monitor-charts-hub.htm
In the November 2022 release, a Qlik logo was introduced in the 'App Overview'.
It also functions as a 'Back to Home' button, which takes you to the Hub.
Completely unnecessary.
We know the product is from Qlik already.
Please remove it.
Hello,
It's 2022, and I was wondering if Qlik Sense could implement movable windows. For instance, when you choose to create or modify an expression, the Expression editor window takes up the entire screen. Developers should be able to resize or move the window to see what's behind it, to remind them which KPI they are revising or which measure they are modifying. Or, when you start a load script run, you could move the window so you can see the actual load script and make sure you didn't forget to change something before running the load. This could save developers a lot of time.
Thanks in advance,
Chris