Recent Discussions
-
How to use colormix2 if lowest number is 0
Hi all.
Could someone please advise how to adjust the Rank part for ColorMix2() to have a gradient starting from 0, not -1? In this case, 0 is the smallest possible number and should be perceived by the function as -1, that is, shown in red.
I don't use ColorMix1 because the colours should go from red to green through white.
My current results and expression:
=ColorMix2((Rank(total Sum(Velocity_Sales))) / (NoOfRows(Total)/2), Red(), Green(), White())
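One possible adjustment (a sketch only, and it assumes the chart has more than one row): rescale the rank onto the full -1..1 range that ColorMix2() expects, and negate it because Rank() assigns 1 to the highest value, so that the lowest value lands on -1 (red), the middle on 0 (white) and the highest on 1 (green):
ColorMix2(
    -((2 * (Rank(total Sum(Velocity_Sales)) - 1) / (NoOfRows(total) - 1)) - 1),
    Red(), Green(), White()
)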
-
How to Connect Local Excel Files in Qlik Sense Cloud
I have a virtual machine with some Excel (.xls) files saved in a local folder.
I manually upload these files using the Data Manager in Qlik Sense to create an app. However, when I reload the app, it fetches data from the uploaded copy stored under 'Data Files' within Qlik Sense, rather than from the original source file in the local folder.
As a result, any changes made to the original Excel file (or if I replace it with a new version) are not reflected in the app after a reload.
I would like the app to always load the most recent data from the source Excel file in the local folder on the VM.
How can I achieve this?
- I have also installed Qlik Data Gateway - Direct Access.
- I also tried to achieve this using Qlik DataTransfer, but was not able to.
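One possible direction, sketched with hypothetical names (a folder data connection called VM_Folder created through the Direct Access gateway, a workbook MyWorkbook.xls, a sheet Sheet1), and assuming the gateway release in use supports folder connections: when the app loads through such a connection, each reload reads the live file on the VM instead of the copy stored under Data Files.
Sales:
LOAD *
FROM [lib://VM_Folder/MyWorkbook.xls]
(biff, embedded labels, table is Sheet1$);
-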
Connecting using JWT token with QlikSense.NetSDK only works once
I am writing my own API and one of its endpoints needs to fetch data from a Qlik app. My users authenticate against my API using a JWT token. I configured a virtual proxy in Qlik to connect to Qlik using the same token. This works, but only once per token. This is my code:
private string GetTokenFromAuthHeader()
{
    string authorizationHeader = HttpContext.Request.Headers.Authorization;
    if (string.IsNullOrEmpty(authorizationHeader) || !authorizationHeader.StartsWith("Bearer "))
    {
        throw new InvalidParameterValueException("No bearer token found");
    }
    return authorizationHeader.Substring("Bearer ".Length).Trim();
}

private static ILocation ConnectToQlikUsingToken(string token)
{
    ILocation location = Location.FromUri($"wss://{server}");
    location.VirtualProxyPath = "myproxy";
    location.AsJwtViaProxy(token, false);
    return location;
}

[HttpPost(Name = "myEndpoint")]
public IActionResult Post()
{
    string token = GetTokenFromAuthHeader();
    var location = ConnectToQlikUsingToken(token);
    string appName = "my-appname";
    var appIdentifier = LocationExtensions.AppWithNameOrDefault(location, appName);
    Qlik.Engine.ISession session = Session.WithApp(appIdentifier, SessionType.Default);
    var app = location.App(appIdentifier, session);
    var result = app.EvaluateEx("=Count(Task.Category)");
    return Ok(result);
}
Everything works well for the first request. But when I send another request to my API, authentication fails throwing this error:
System.Security.Authentication.AuthenticationException: Authentication failed.
   at Qlik.Engine.Communication.QlikConnection.AwaitResponseTask[T](T task, String methodName, CancellationToken cancellationToken)
   at Qlik.Engine.Communication.QlikConnection.AwaitResponse(Task task, String methodName, CancellationToken cancellationToken)
   at Qlik.Engine.Communication.QlikConnection.Ping()
   at Qlik.Sense.JsonRpc.GenericLocation.DisposeOnError(IDisposable o, Action f)
   at Qlik.Engine.LocationExtensions.Hub(ILocation location, ISession session)
   at Qlik.Engine.LocationExtensions.AppsWithNameOrDefault(ILocation location, String appName)
   at Qlik.Engine.LocationExtensions.AppWithNameOrDefault(ILocation location, String appName)
Obtaining a new JWT token and using that works. Why? Do I need to close the session somehow? I tried Dispose() on the location and the app, but it does not help. Any ideas?
-
Data loading crashes with 20GB qvd
Hello, I need to optimize a loading script (that I did not create) that handles QVDs ranging from 10 to 20 GB. The problem is that data retrieval takes a very long time and then fails due to a lack of server resources for the largest QVD (the server has 64 GB).
The script will retrieve data for the current day,
then concatenate it with the existing data in the QVD,
and finally replace the QVD with the new data. There is a "where not exists" clause to avoid duplicates.
Changes already implemented:
- CreateSearchIndexOnReload is set to 0.
- We only apply the "where not exists" check to data with a date within the loading period, and we load everything before that period as-is.
This solved the problem for the daily loads, but because of repeated crashes we now have several days of data to catch up on, and that is too much for the server with the largest QVDs.
I'm currently trying to purge the data to keep only one year, but the server doesn't have enough resources to do something like "LOAD * FROM qvd WHERE date > 04/17/2024" if the QVD exceeds 15 GB.
Also, we currently have 1.5 years of data and, as I said, one of the QVDs is 20 GB. That seems huge to me.
Do you have any ideas on how the script can be optimized and if the size of the qvd seems normal to you?
Thanks in advance
The script looks like this:
LET LastExecTime = if(LastExecTime, LastExecTime, MakeDate(2025,4,1));
Let ThisExecTime = now();
LET v_begin = date(LastExecTime, 'YYYYMMDD hh:mm:ss.fff');
LET v_end = date(ThisExecTime, 'YYYYMMDD hh:mm:ss.fff') ;
TEST:
LOAD
DateTime,
TagName,
Num(Value,'##,##') AS Value,
Date(DateTime) & '/' & Time(DateTime) & '/' & TagName & 'XXX' AS Key;
SQL SELECT [DateTime],
[TagName],
[Value],
FROM XXX
AND [DateTime] >= '$(v_begin)'
AND [DateTime] <= '$(v_end)';
TMP:
NoConcatenate
LOAD *
FROM [lib://XXX/TEST.qvd] (qvd)
Where DateTime >= '$(v_begin)';
Concatenate(TEST)
LOAD *
Resident TMP
WHERE not exists(Key);
DROP TABLE TMP;
Concatenate(TEST)
LOAD *
FROM [lib://XXX/TEST.qvd] (qvd)
Where DateTime < '$(v_begin)' ;
store * FROM TEST into [lib://XXX/TEST.qvd];
DROP TABLE TEST;
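One way the purge might be made cheaper (a sketch under assumptions, not a tested rewrite of the script above): a plain WHERE on DateTime forces a standard, memory-hungry load of the whole QVD, whereas a single-field Where Exists() keeps the QVD read optimized. The cut-off 2024-04-17 is just the example date from the question, and the output file name is hypothetical.
// 1) Read only the DateTime column and keep the values inside the retention window.
//    The WHERE clause makes this a standard load, but the result is tiny (one field, distinct values).
KeepDates:
LOAD DISTINCT DateTime
FROM [lib://XXX/TEST.qvd] (qvd)
WHERE DateTime >= MakeDate(2024, 4, 17);

// 2) The full read of the QVD stays optimized because the only filter is a single-field Exists().
History:
LOAD *
FROM [lib://XXX/TEST.qvd] (qvd)
WHERE Exists(DateTime);

DROP TABLE KeepDates;

// 3) Store the purged table (to a new file until the result is verified).
STORE History INTO [lib://XXX/TEST_purged.qvd] (qvd);
DROP TABLE History;
-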
Applying older credits to newer debit buckets in Qlik (aging chart)
Hi all,
I'm building an AR aging chart in Qlik and need two things:
1. Aging buckets: Current, 30, 60, 90 days, based on my PeriodPostingDate (cut-off) vs. PeriodDueDate.
2. Credit application: any credit (negative amount) in an older bucket should reduce the debit balance in newer buckets.
I can already do a rolling total that stays dynamic for any historic PeriodPostingDate selection:
Set Analysis for Current:
Sum({<PeriodPostingDate = {"<=202505"}>} Debit)
-
Sum({<PeriodPostingDate = {"<=202505"}>} Credit)

Set Analysis for 30, 60, 90 Days:
Sum({<PeriodPostingDate = {"<=202505"}, PeriodDueDate = {202504}>} Debit) // PeriodDueDate 202504, 202503, 202502
-
Sum({<PeriodPostingDate = {"<=202505"}, PeriodDueDate = {202504}>} Credit) // PeriodDueDate 202504, 202503, 202502

My challenge:
If I have a credit in the 90-day bucket (−3318.05), I want that to be "spent" down in the 60-day and 30-day buckets (so those buckets show reduced debits), and eventually leave the net in Current. I still need to allow the user to pick any historic period and see the correct applied values.

What I've tried:
Set analysis (above) — but it simply shows each bucket independently, not applying old credits forward.
Script‐level running totals — but that loses the bucket breakdown.
What I’m looking for:
A chart expression (or script pattern) in Qlik Sense that “carries” the negative bucket values into subsequent buckets, and remains fully dynamic for any selected period.
Alternatively, if this isn’t possible purely in the front end, I’d love to see a best‑practice script approach.
Example data:
Load * inline [
PeriodDueDate,PeriodPostingDate,Debit,Credit
202505,202504,3557.03,0
202504,202503,5832.54,0
202504,202504,0,3356.68
202503,202502,4190.19,0
202503,202503,0,3348
202502,202502,1865.67,5183.72
202412,202411,17.86,2087.09
202412,202412,1647.23,1647.23
202411,202410,27.13,0
202411,202411,0,1000
202410,202409,35.4,0
202410,202410,0,1000
202409,202408,36.25,0
202408,202407,35.92,0
202407,202406,34.46,0
202406,202405,3900.07,0
202405,202404,3888.94,0
202405,202405,6112.2,9001.14
202404,202403,1000,0
202404,202404,0,2000
202403,202403,3700,3700
202212,202211,29.16,4083.77
202211,202210,29.91,0
202210,202209,26.29,0
202209,202208,3998.41,0
202208,202207,3989.82,0
202208,202208,0,3989.82
202207,202206,3969.31,0
202207,202207,0,3964.56
202206,202205,23.06,0
202206,202206,0,4050
202205,202204,22.19,0
202204,202203,22.31,0
202204,202204,0,63.59
202203,202202,20.05,0
202202,202201,21.23,0
202201,202112,5785.85,0
202112,202112,2072.3,3858.15
202105,202105,251.86,251.86
202012,202011,19.12,3895.74
202012,202012,6415.51,6415.51
202011,202010,19.65,0
202010,202009,18.93,0
202009,202008,19.46,0
202008,202007,20.16,0
202008,202008,1827.43,1827.43
202007,202006,3798.42,0
202006,202005,3440.9,0
202006,202006,410.49,3851.39
202005,202005,12102.95,12102.95
201912,201911,28.97,3945.49
201911,201910,29.71,0
201910,201909,28.54,0
201909,201908,3858.27,0
201908,201907,31.89,0
201908,201908,0,4091.41
201907,201906,30.63,0
201906,201905,31.4,0
201905,201904,30.16,0
201904,201903,874.3,0
201903,201902,21.79,0
201902,201901,857.5,0
201902,201902,906.69,0
201901,201812,2307.05,0
201901,201901,0,1000
201812,201811,897.35,0
201812,201812,0,897.35
201811,201810,15.63,0
201811,201811,0,2059.93
201810,201809,15.01,0
201809,201808,289.82,0
201808,201807,13.2,0
201807,201806,27.36,0
201807,201807,0,2000
201806,201805,28.06,0
201805,201804,30.63,0
201805,201805,0,500
201804,201803,329.93,0
201803,201802,3847.64,0
201801,201712,0.1,0
201801,201801,0,37.45
201712,201711,13.2,0
201712,201712,1698,1711.2
201711,201710,1193.58,0
201711,201711,0,2736.49
201710,201709,1082.54,0
201709,201708,460.37,0
201708,201707,23.66,0
201708,201708,0,2955.6
201707,201706,22.72,0
201706,201705,27.29,0
201706,201706,0,500
201705,201704,26.2,0
201704,201703,474.41,0
201703,201702,24.46,0
201703,201703,0,500
201702,201701,3417.77,0
201701,201701,700,757.91
201612,201611,703.87,0
201612,201612,293.13,1000
201611,201610,32.13,0
201611,201611,0,4014.22
201610,201609,30.85,0
201609,201608,31.63,0
201608,201607,31.37,0
201607,201606,1571.96,0
201606,201605,3759.78,0
201605,201604,2276.5,0
201605,201605,0,2160
201604,201603,6.12,0
201604,201604,0,2345.1
201603,201603,779,0
201602,201603,0,0.02
];
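One pattern sometimes used for this kind of carry-forward (a sketch with a hard-coded 202505 cut-off, not a confirmed answer): write each bucket as the difference of two clamped cumulative nets, accumulated from the oldest bucket towards Current, i.e. bucket = RangeMax(0, cumulative net up to and including this bucket) minus RangeMax(0, cumulative net up to the previous, older bucket). An older bucket that is net credit then shows 0 and its surplus automatically reduces the newer buckets. For the 30-day bucket this would look like:
RangeMax(0,
    Sum({<PeriodPostingDate = {"<=202505"}, PeriodDueDate = {"<=202504"}>} Debit - Credit)
)
-
RangeMax(0,
    Sum({<PeriodPostingDate = {"<=202505"}, PeriodDueDate = {"<=202503"}>} Debit - Credit)
)
-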
Error: Invalid stardust hook call. Hooks can only be called inside a visualization component
Hi!
This is my index.ts for my Qlik Sense extension, which I am trying to build with NebulaJS / React.
I am getting the error "Error: Invalid stardust hook call. Hooks can only be called inside a visualization component" inside the component().
My index.ts:
/* eslint-disable react-hooks/rules-of-hooks */
import {
  useApp,
  useElement,
  useEmbed,
  useInteractionState,
  useKeyboard,
  useModel,
  useRect,
  useSelections,
  useStaleLayout,
  useTranslator,
  useTheme,
} from "@nebula.js/stardust";
import { useEffect } from "react";
import qae from "./qae";
import useReactRoot from "./hooks/use-react-root";
import { renderFormBuilder } from "./Root";
import { ExtendedLayout, Galaxy } from "./types";

const ext = (env: Galaxy) => ({
  definition: qae,
  support: {
    snapshot: true,
    export: true,
    exportData: true,
  },
});

export default function supernova(env: Galaxy) {
  return {
    qae,
    ext: ext(env),
    component() {
      const rootElement = useElement();
      const layout = useStaleLayout() as ExtendedLayout;
      const app: any = useApp();
      const model: any = useModel();
      const interactions = useInteractionState();
      const translator = useTranslator();
      const selections = useSelections();
      const keyboard = useKeyboard();
      const rect = useRect();
      const embed = useEmbed();
      const theme = useTheme();
      const reactRoot = useReactRoot(rootElement);

      useEffect(() => {
        renderFormBuilder({
          reactRoot,
          rootElement,
          layout,
          app,
          model,
          interactions,
          translator,
          selections,
          keyboard,
          rect,
          embed,
          theme,
        });
      }, [
        reactRoot, rootElement, layout, app, model, interactions,
        translator, selections, keyboard, rect, embed, theme,
      ]);

      return null;
    },
  };
}
Does anyone have any idea what I am doing wrong? I've been bashing my head against this for a while. 😉
Tks a bunch!
-
Batch apply mode and record order
Hi,
I'm hoping someone can help me with this question. When replicating to a change table exclusively, and batch processing mode is enabled, are changes from the source applied to the change tables in the target in the same sequence? I'm aware that batch apply mode may not apply the changes in order, as is done with transactional apply mode.
For example - Record A is updated, then record B is updated. Is it possible that when batch apply mode is selected, record B will arrive in the change table before Record A?
Help with this question is appreciated as always. Thank you!
Regards,
Mohammed
-
Not able to discover tables under Metadata in Qlik Compose
Hi,
We have a Replicate + Compose architecture where Replicate loads data into the landing layer and Compose loads the Bronze layer in AWS Databricks Unity Catalog.
The tables created by Replicate in the landing layer are not displayed when we discover them in the Metadata section of Compose.
If we try to discover the same tables using another instance of Compose, it works.
Both Compose instances are on the Nov 2023 version and Replicate is on the May 2024 version.
We have checked the permissions and they are the same for both instances and on Databricks.
Has anyone faced a similar issue, and what was the resolution or workaround?
Thanks
-
Use only workday dates in variable
Hello,
Is there a way to allow a variable to take only weekday dates? What I am trying to do is basically the below:
Assuming today is Monday:
Yesterday = Friday's date
Day before = Thursday's date
and then use these two variables, Yesterday and Day before, in an SQL query in the data load editor.
Appreciate the help!
Thanks,
Naman
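A minimal sketch (hypothetical variable names, no holiday calendar, and assuming the reload itself runs on a weekday): FirstWorkDate() steps back over Monday-Friday working days from a given end date, so the two previous workdays can be derived directly in the load script and then expanded into the SQL.
LET vYesterday = Date(FirstWorkDate(Today(), 2), 'YYYY-MM-DD');  // previous workday (Friday when today is Monday)
LET vDayBefore = Date(FirstWorkDate(Today(), 3), 'YYYY-MM-DD');  // the workday before that (Thursday)
// Public holidays can be passed as extra arguments to FirstWorkDate() if needed.

// Hypothetical usage in the data load editor:
// SQL SELECT * FROM dbo.Orders
// WHERE OrderDate IN ('$(vYesterday)', '$(vDayBefore)');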
-
How do people design loads with QVDs?
I do daily loads for some apps and I reload years of data at a time. I'm wondering if it makes sense to keep the previous 5 years of data in a QVD and then just reload the current year and merge the two in the load script.
Please point me to docs, or let me know if there is a better way to do this.
Qlik Sense on Prem
Thanks
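A sketch of the pattern described above, with hypothetical connection, table and field names (lib://QVDs, Facts, OrderId, OrderDate, Amount): keep everything before the current year in a history QVD that is rebuilt only when the year rolls over, and pull just the current year from the source on the daily reload.
LET vYearStart = Date(YearStart(Today()), 'YYYY-MM-DD');

// 1) Current year from the source system.
Facts:
LOAD OrderId, OrderDate, Amount;
SQL SELECT OrderId, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate >= '$(vYearStart)';

// 2) Append the stored history (strictly pre-current-year rows).
//    The Not Exists() guard assumes OrderId is unique per row.
Concatenate (Facts)
LOAD *
FROM [lib://QVDs/Facts_History.qvd] (qvd)
WHERE Not Exists(OrderId);

// 3) Store the combined table for the app(s) to use.
STORE Facts INTO [lib://QVDs/Facts.qvd] (qvd);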