Looking for NPrinting articles that were created by other users/members? They were moved to their new home here: https://community.qlik.com/t5/Member-Articles/tkb-p/qlik-communityarticles.
They are all organized under the appropriate labels; you can search by a label and subscribe to labels.
Remember official Qlik Support articles for NPrinting can be found here: https://community.qlik.com/t5/Knowledge/tkb-p/qlik-support-knowledge-base
The NPrinting document board has been retired.
[Edited]
Dear All,
Qlik NPrinting May 2021 SR3, or newer, solves this issue and is available for download. Refer to the Release Notes https://community.qlik.com/t5/Release-Notes/Qlik-NPrinting-May-2021-SR3/ta-p/1872634#toc-hId-404311360.
Through conversations with Qlik NPrinting users, we discovered that the latest version, v2109, of Excel and PowerPoint included in Microsoft Office 365 does not work correctly with Qlik NPrinting Designer. The R&D team is working on a fix right now; in the meantime, we suggest avoiding installing Microsoft Excel and PowerPoint v2109 on the computers you use to edit Qlik NPrinting templates. Please note that Qlik NPrinting uses Microsoft Office only to edit report templates; it is not used to generate reports.
To check the version of Microsoft Office you are using refer to https://support.microsoft.com/en-us/office/about-office-what-version-of-office-am-i-using-932788b8-a3ce-44bf-bb09-e334518b8b19
The update history for Microsoft 365 Apps is available at https://docs.microsoft.com/en-us/officeupdates/update-history-microsoft365-apps-by-date
You can also refer to the support article https://community.qlik.com/t5/Knowledge/Qlik-NPrinting-can-t-find-supported-reporting-template-editor/ta-p/1716228
SUGGESTED SOLUTION https://community.qlik.com/t5/Qlik-NPrinting-Documents/Qlik-NPrinting-Designer-cannot-find-a-supported-template-editor/ta-p/1861060
Word, PixelPerfect, and HTML templates are not affected.
Best Regards,
Ruggero
Hello QlikView Users,
If you have any questions regarding installing QlikView please check out this post on the Qlik Support Updates Blog. There is a new step-by-step guide available that helps troubleshoot common issues that can occur. It also includes pictures to walk you through the installation. It is based on the Help site installation guide but hopefully the add-ons will help with the process.
Be sure to subscribe to the Qlik Support Updates Blog as well! The blog offers guidance from your peers on hot topics, announcements of new software releases for Qlik Sense, and updates regarding Qlik from the VP of Customer Success, Daniel Coullet.
Kind regards,
Qlik Digital Support
Afternoon,
I would appreciate some insight from those who have a better understanding of the internal processing of QlikView than I do.
The largest model in our environment is rebuilt every day via a scheduled task in QMC. It takes around 40 minutes to rebuild.
When I open that same model using QlikView Desktop and build it there, it can take up to 1 hour 30 minutes, depending on other processing being done on the server. I tested the rebuild when there was no other activity on the system (after working hours), and it still took over 1 hour.
I am puzzled by such a difference in build times between these two methods of rebuilding, as I assume they use the same processing engine. I run my desktop rebuild on the same server that runs QMC, logging onto that server via a remote desktop connection. We also have only one instance of QMC running on that server. One of my colleagues carried out the same exercise and got the same results.
I am using the May 2023 version of QlikView.
I would appreciate some insight into why this is happening.
Hi everyone,
I'm facing a performance issue with a Qlik Sense script that performs GeoAnalytics Intersects operations within a loop.
The script works perfectly for the first batch of data, completing the intersect operation quickly. However, when it moves to the second batch, the process either slows down dramatically or seems to stop completely.
Interestingly, when I run the same intersect operations on individual QVD files with separate LOAD statements, each one finishes in less than 5 minutes. This suggests the issue is not with the core Intersects function itself, but with how the script handles the data flow and memory across multiple iterations.
I suspect it may be related to memory management, as the script generates inline tables for each intersect operation, which might not be fully cleared between runs, or there might be an issue with how the GeoAnalytics connector handles repeated calls.
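To make the suspicion concrete, the per-iteration hygiene I believe the loop needs looks roughly like this (a minimal sketch with placeholder table and path names, not my actual script):

```qlik
// Minimal per-batch hygiene sketch (placeholder names): persist each result,
// drop it, and clear the large inline-table variables so the next iteration
// starts from a clean slate.
FOR i = 0 TO $(vNumBatches) - 1

    [Result]:
    // ... the Intersects SELECT for this batch goes here ...

    STORE [Result] INTO [lib://Target/intersect.qvd] (qvd);
    DROP TABLE [Result];

    LET Dataset1InlineTable = '';   // the geometry text can be very large
    LET Dataset2InlineTable = '';

NEXT i
```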
Could someone take a look at my code and help me identify what could be causing this? Any suggestions on how to improve the script's performance for subsequent batches would be greatly appreciated.
I tried two different approaches, and the same problem arises with both.
Thanks in advance!
THE FIRST CODE
// =========================================================================
// LOAD DATA FOR 'CONSERVATION_UNITS'
// =========================================================================
// The table name must match the field prefixes used by the connector-generated code below
LET vTable_CU = 'UNIDADES_CONSERVACAO';
LET vQVD_Path_CU = 'lib://Downloads/CGMA_QVDs/UNIDADES_CONSERVACAO.qvd';
$(vTable_CU):
LOAD
Geometry AS $(vTable_CU).GEOMETRIA,
_autoIndex_ AS $(vTable_CU).AUTOID
FROM [$(vQVD_Path_CU)] (qvd);
// Check load
LET vCU_Count = NoOfRows('$(vTable_CU)');
TRACE $(vTable_CU) loaded with $(vCU_Count) records.;
// =========================================================================
// INITIAL SETUP
// =========================================================================
LET vTable_ATP = 'REQUERIMENTO_ATP_NOVO';
LET vQVD_Path_ATP = 'lib://Downloads/CGMA_QVDs/REQUERIMENTO_ATP_NOVO.qvd';
LET vBatchSize = 20000;
// =========================================================================
// Load the entire table just to count records
// =========================================================================
$(vTable_ATP)_Full:
LOAD *
FROM [$(vQVD_Path_ATP)] (qvd);
LET vTotalRecords = NoOfRows('$(vTable_ATP)_Full');
TRACE Total records in $(vTable_ATP): $(vTotalRecords);
LET vNumBatches = Ceil($(vTotalRecords)/$(vBatchSize));
TRACE Number of batches of $(vBatchSize) records: $(vNumBatches);
// Remove full table loaded just for counting
DROP TABLE $(vTable_ATP)_Full;
// =========================================================================
// BATCH LOOP
// =========================================================================
FOR i = 0 TO $(vNumBatches)-1
LIB CONNECT TO 'GeoAnalyticsSEMA (sema-qliksense_administrator)';
LET vStart = $(i) * $(vBatchSize);
LET vEnd = $(vStart) + $(vBatchSize);
TRACE Loading batch $(=i+1): records $(=vStart+1) to $(=RangeMin(vEnd, vTotalRecords));
// Load specific batch
$(vTable_ATP)_Batch_$(=i+1):
LOAD
Geometry AS $(vTable_ATP).GEOMETRIA,
_autoIndex_ AS $(vTable_ATP).AUTOID
FROM [$(vQVD_Path_ATP)] (qvd)
WHERE RecNo() > $(vStart) AND RecNo() <= $(vEnd);
// =========================================================================
// Special character mapping
// =========================================================================
[_inlineMap_]:
mapping LOAD * inline [
_char_, _utf_
"'", '\u0027'
'"', '\u0022'
"[", '\u005b'
"/", '\u002f'
"*", '\u002a'
";", '\u003b'
"}", '\u007d'
"{", '\u007b'
"`", '\u0060'
"´", '\u00b4'
" ", '\u0009'
];
// =========================================================================
// Check for mandatory fields in CONSERVATION_UNITS table
// =========================================================================
IF FieldNumber('UNIDADES_CONSERVACAO.AUTOID', 'UNIDADES_CONSERVACAO') = 0 THEN
call InvalidInlineData('The field UNIDADES_CONSERVACAO.AUTOID in UNIDADES_CONSERVACAO is not available');
END IF
IF FieldNumber('UNIDADES_CONSERVACAO.GEOMETRIA', 'UNIDADES_CONSERVACAO') = 0 THEN
call InvalidInlineData('The field UNIDADES_CONSERVACAO.GEOMETRIA in UNIDADES_CONSERVACAO is not available');
END IF
// =========================================================================
// Prepare CONSERVATION_UNITS inline table
// =========================================================================
Let [Dataset1InlineTable] = 'UNIDADES_CONSERVACAO.AUTOID' & Chr(9) & 'UNIDADES_CONSERVACAO.GEOMETRIA';
Let numRowsUC = NoOfRows('UNIDADES_CONSERVACAO');
Let chunkSize = 1000;
Let chunksUC = numRowsUC/chunkSize;
For nUC = 0 to chunksUC
Let chunkTextUC = '';
Let chunkUC = nUC*chunkSize;
For iUC = 0 To chunkSize-1
Let rowNrUC = chunkUC + iUC;
Exit for when rowNrUC >= numRowsUC;
Let rowUC = '';
For Each f In 'UNIDADES_CONSERVACAO.AUTOID','UNIDADES_CONSERVACAO.GEOMETRIA'
rowUC = rowUC & Chr(9) & MapSubString('_inlineMap_', Peek('$(f)', $(rowNrUC), 'UNIDADES_CONSERVACAO'));
Next
chunkTextUC = chunkTextUC & Chr(10) & Mid('$(rowUC)', 2);
Next
[Dataset1InlineTable] = [Dataset1InlineTable] & chunkTextUC;
Next
// =========================================================================
// Check for mandatory fields in the ATP batch table
// =========================================================================
IF FieldNumber('REQUERIMENTO_ATP_NOVO.AUTOID', '$(vTable_ATP)_Batch_$(=i+1)') = 0 THEN
call InvalidInlineData('The field REQUERIMENTO_ATP_NOVO.AUTOID is not available');
END IF
IF FieldNumber('REQUERIMENTO_ATP_NOVO.GEOMETRIA', '$(vTable_ATP)_Batch_$(=i+1)') = 0 THEN
call InvalidInlineData('The field REQUERIMENTO_ATP_NOVO.GEOMETRIA is not available');
END IF
// =========================================================================
// Prepare the batch inline table
// =========================================================================
Let [REQUERIMENTO_ATP_NOVOInlineTable] = 'REQUERIMENTO_ATP_NOVO.AUTOID' & Chr(9) & 'REQUERIMENTO_ATP_NOVO.GEOMETRIA';
Let numRowsATP = NoOfRows('$(vTable_ATP)_Batch_$(=i+1)');
Let chunksATP = numRowsATP/chunkSize;
For nATP = 0 to chunksATP
Let chunkTextATP = '';
Let chunkATP = nATP*chunkSize;
For iATP = 0 To chunkSize-1
Let rowNrATP = chunkATP + iATP;
Exit for when rowNrATP >= numRowsATP;
Let rowATP = '';
For Each f In 'REQUERIMENTO_ATP_NOVO.AUTOID','REQUERIMENTO_ATP_NOVO.GEOMETRIA'
rowATP = rowATP & Chr(9) & MapSubString('_inlineMap_', Peek('$(f)', $(rowNrATP), '$(vTable_ATP)_Batch_$(=i+1)'));
Next
chunkTextATP = chunkTextATP & Chr(10) & Mid('$(rowATP)', 2);
Next
[REQUERIMENTO_ATP_NOVOInlineTable] = [REQUERIMENTO_ATP_NOVOInlineTable] & chunkTextATP;
Next
// =========================================================================
// Execute GeoAnalytics Intersects operation
// =========================================================================
[IntersectsTable_Batch_$(=i+1)]:
SQL SELECT [Dataset1_REQUERIMENTO_ATP_NOVO_RelationKey],
[UNIDADES_CONSERVACAO.AUTOID],
[REQUERIMENTO_ATP_NOVO.AUTOID],
[Dataset1_RelativeOverlap],
[REQUERIMENTO_ATP_NOVO_RelativeOverlap]
FROM Intersects(intersectsCount='0', dataset1='Dataset1', dataset2='REQUERIMENTO_ATP_NOVO')
DATASOURCE Dataset1 INLINE tableName='UNIDADES_CONSERVACAO', tableFields='UNIDADES_CONSERVACAO.AUTOID,UNIDADES_CONSERVACAO.GEOMETRIA', geometryType='POLYGON', loadDistinct='NO', suffix='', crs='Auto' {$(Dataset1InlineTable)}
DATASOURCE REQUERIMENTO_ATP_NOVO INLINE tableName='REQUERIMENTO_ATP_NOVO', tableFields='REQUERIMENTO_ATP_NOVO.AUTOID,REQUERIMENTO_ATP_NOVO.GEOMETRIA', geometryType='POLYGON', loadDistinct='NO', suffix='', crs='Auto' {$(REQUERIMENTO_ATP_NOVOInlineTable)}
;
//tag field [Dataset1_REQUERIMENTO_ATP_NOVO_RelationKey] with '$primarykey';
// Load the destination QVD if it already exists, to concatenate previous results
LET vPathQVD = 'lib://Downloads/CGMA/Processamento/intersect_2.qvd';
IF NOT IsNull(FileTime('$(vPathQVD)')) THEN
CONCATENATE([IntersectsTable_Batch_$(=i+1)])
LOAD * FROM [$(vPathQVD)] (qvd);
END IF;
// Persist the accumulated results and drop the in-memory copy
STORE [IntersectsTable_Batch_$(=i+1)] INTO [$(vPathQVD)] (qvd);
DROP TABLE [IntersectsTable_Batch_$(=i+1)];
[Dataset1InlineTable] = '';
[REQUERIMENTO_ATP_NOVOInlineTable] = '';
// Drop batch table
DROP TABLE $(vTable_ATP)_Batch_$(=i+1);
// Drop temporary loop variables
LET Dataset1InlineTable = '';
LET REQUERIMENTO_ATP_NOVOInlineTable = '';
LET chunkTextUC = '';
LET chunkTextATP = '';
LET rowUC = '';
LET rowATP = '';
LET rowNrUC = '';
LET rowNrATP = '';
LET chunkUC = '';
LET chunkATP = '';
LET nUC = '';
LET iUC = '';
LET nATP = '';
LET iATP = '';
// Drop batch variables
LET vStart = '';
LET vEnd = '';
TRACE Memory cleared after batch $(=i+1);
DISCONNECT;
SLEEP(10000);
NEXT i
// ==========================================
// FINAL CLEANUP OF ALL POSSIBLE VARIABLES
// ==========================================
TRACE Full load of $(vTable_ATP) in $(vNumBatches) batches;
LET vTable_ATP = '';
LET vQVD_Path_ATP = '';
LET vBatchSize = '';
LET vTotalRecords = '';
LET vNumBatches = '';
TRACE Final memory cleanup completed.;
-----------------------------
THE SECOND CODE
// Character mapping
[_inlineMap_]:
mapping LOAD * inline [
_char_, _utf_
"'", '\u0027'
'"', '\u0022'
"[", '\u005b'
"/", '\u002f'
"*", '\u002a'
";", '\u003b'
"}", '\u007d'
"{", '\u007b'
"`", '\u0060'
"´", '\u00b4'
" ", '\u0009'
];
// --- SUB-ROUTINE DEFINITION: IntersectArquivo (IntersectFile) ---
SUB IntersectFile(vPath, vFileName, vIntersectingTable, vIntersectingTable_ID, vIntersectingTable_Geom)
LIB CONNECT TO 'GeoAnalyticsSEMA (sema-qliksense_administrator)';
// TRACE for debugging
TRACE PATH=$(vPath);
TRACE INTERSECTEDFILE=$(vFileName);
TRACE INTERSECTOR=$(vIntersectingTable);
TRACE INTERSECTOR_ID=$(vIntersectingTable_ID);
TRACE INTERSECTOR_GEOMETRY=$(vIntersectingTable_Geom);
// Extract the base file name to use as the table name
LET vBaseName = SubField(SubField('$(vFileName)', '/', -1), '.', 1);
// The intersecting table (smaller and complete table)
// Loaded only once
INTERSECTING_TABLE:
NoConcatenate
LOAD
$(vIntersectingTable_Geom) AS INTERSECTING_TABLE.GEOMETRY,
$(vIntersectingTable_ID) AS INTERSECTING_TABLE.AUTOID
RESIDENT $(vIntersectingTable);
// Generates the inline table for the INTERSECTING_TABLE
// The INTERSECTING_TABLE is generated only once, outside the loop
LET [INTERSECTING_TABLEInlineTable] = 'INTERSECTING_TABLE.AUTOID' & Chr(9) & 'INTERSECTING_TABLE.GEOMETRY';
Let numRows_Intersecting = NoOfRows('INTERSECTING_TABLE');
For i = 0 To numRows_Intersecting - 1
Let row = '';
For Each f In 'INTERSECTING_TABLE.AUTOID', 'INTERSECTING_TABLE.GEOMETRY'
row = row & Chr(9) & MapSubString('_inlineMap_', Peek('$(f)', $(i), 'INTERSECTING_TABLE'));
Next
[INTERSECTING_TABLEInlineTable] = [INTERSECTING_TABLEInlineTable] & Chr(10) & Mid('$(row)', 2);
Next
// Load the QVD file for the current chunk
// This code is the body of the loop that will iterate over the files.
// It will be executed for each file passed to the sub-routine.
// The table to be intersected (the current chunk)
INTERSECTED_TABLE:
LOAD
$(vBaseName).GEOMETRY as INTERSECTED_TABLE.GEOMETRY,
$(vBaseName).AUTOID as INTERSECTED_TABLE.AUTOID
FROM [$(vFileName)] (qvd);
// Generates the inline table for the current chunk of the INTERSECTED_TABLE
LET [INTERSECTED_TABLEInlineTable] = 'INTERSECTED_TABLE.AUTOID' & Chr(9) & 'INTERSECTED_TABLE.GEOMETRY';
Let numRows_Intersected = NoOfRows('INTERSECTED_TABLE');
For i = 0 To numRows_Intersected - 1
Let row = '';
For Each f In 'INTERSECTED_TABLE.AUTOID', 'INTERSECTED_TABLE.GEOMETRY'
row = row & Chr(9) & MapSubString('_inlineMap_', Peek('$(f)', $(i), 'INTERSECTED_TABLE'));
Next
[INTERSECTED_TABLEInlineTable] = [INTERSECTED_TABLEInlineTable] & Chr(10) & Mid('$(row)', 2);
Next
// Connection and Intersect operation with the inline tables
[IntersectsTable_TEMP]:
SQL SELECT [INTERSECTING_TABLE_INTERSECTED_RelationKey], [INTERSECTING_TABLE.AUTOID], [INTERSECTED_TABLE.AUTOID], [INTERSECTING_TABLE_RelativeOverlap], [INTERSECTED_TABLE_RelativeOverlap] FROM Intersects(intersectsCount='0', dataset1='INTERSECTING_TABLE', dataset2='INTERSECTED_TABLE')
DATASOURCE INTERSECTING_TABLE INLINE tableName='INTERSECTING_TABLE', tableFields='INTERSECTING_TABLE.AUTOID,INTERSECTING_TABLE.GEOMETRY', geometryType='POLYGON', loadDistinct='NO', suffix='', crs='Auto' {$(INTERSECTING_TABLEInlineTable)}
DATASOURCE INTERSECTED_TABLE INLINE tableName='INTERSECTED_TABLE', tableFields='INTERSECTED_TABLE.AUTOID,INTERSECTED_TABLE.GEOMETRY', geometryType='POLYGON', loadDistinct='NO', suffix='', crs='Auto' {$(INTERSECTED_TABLEInlineTable)}
;
// Load the destination QVD if it already exists, to perform concatenation
LET vPathQVD = 'lib://Downloads/CGMA/Processamento/intersect.qvd';
IF NOT IsNull(FileTime('$(vPathQVD)')) THEN
CONCATENATE([IntersectsTable_TEMP])
LOAD * FROM [$(vPathQVD)] (qvd);
END IF;
// Save the final table
STORE [IntersectsTable_TEMP] INTO 'lib://Downloads/CGMA/Processamento/intersect.qvd' (qvd);
// Clean up temporary tables and variables
DROP TABLES [IntersectsTable_TEMP], INTERSECTED_TABLE, INTERSECTING_TABLE;
// Clear the inline table for the next use
LET [INTERSECTED_TABLEInlineTable] = '';
LET [INTERSECTING_TABLEInlineTable] = '';
// ==========================
// FINAL MEMORY CLEANUP
// ==========================
// Clear the variables used (assign empty string)
LET INTERSECTING_TABLEInlineTable = '';
LET INTERSECTED_TABLEInlineTable = '';
LET vBaseName = '';
LET numRows_Intersecting = '';
LET numRows_Intersected = '';
LET row = '';
LET f = '';
LET i = '';
LET vPath = '';
LET vFileName = '';
LET vIntersectingTable = '';
LET vIntersectingTable_ID = '';
LET vIntersectingTable_Geom = '';
LET vPathQVD = '';
DISCONNECT;
SLEEP(2000);
END SUB;
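For reference, this is how I invoke the sub-routine, once per chunk file (an illustrative call: the chunk folder is a placeholder, FileList assumes legacy file functions are available, and the intersecting table must already have been loaded resident):

```qlik
// Illustrative invocation: intersect each chunk QVD against UNIDADES_CONSERVACAO.
FOR EACH vChunkFile IN FileList('lib://Downloads/CGMA_QVDs/CHUNKS/*.qvd')
    CALL IntersectFile('lib://Downloads/CGMA_QVDs/CHUNKS/',
                       '$(vChunkFile)',
                       'UNIDADES_CONSERVACAO',
                       'UNIDADES_CONSERVACAO.AUTOID',
                       'UNIDADES_CONSERVACAO.GEOMETRIA');
NEXT vChunkFile
```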
Hi all,
As explained in the forum link below, it is not possible to use a dynamic URL for the WMS layer in the native Map object in Qlik Cloud:
Since GeoOperations is at the script-level, I am not sure what option that leaves me. Do I need the GeoAnalytics extension to solve this problem?
Hi everyone! I'm trying to find a solution to what, through my research, seems to have been an issue for a while for some people. I haven't found a fix that works so far. Has anyone come across the issue shown in the attached image? How did you fix it?
Hello,
I need to build a report from a table.
In this table I have a service hierarchy built from 3 fields.
I want to filter the table in one sheet of the report on all values of the third level of the service hierarchy, which is the department. How can I select all values of the department in the object's filter? I have a lot of values for this field.
Thanks a lot
NPrinting Feb 2024
Hello All,
Has anyone succeeded in integrating QlikView with Okta?
I have been through a few white papers and learned that we need to customize a gateway for SSO (single sign-on) or use third-party gateway tools to achieve this. But is there any way to integrate QlikView directly with Okta? If so, how?
Otherwise, which third-party gateways best support integration between QlikView and Okta?
Any suggestions or input?
Thank you all in advance.
Thanks
Brad.
Hi
I am creating a table in PixelPerfect. The measure column shows Dual(Sum and Count). PixelPerfect does not support the Dual function, so I added the Sum formula field from Qlik, created the Count as a formula in PixelPerfect, and inserted a cell to show the Count formula. When Sum = 0, it is not shown in the total; but when the total Count = 0, it is shown on the total group header, and we want to hide/exclude the 0.
I tried a formatting rule, but it gives an error because my expression contains set analysis:
count({<OD_ReporingMonthEnd={'$(=$(vEndDate))'}>}OD_USD_Equivalent)
1. Can you please guide me on how to hide/exclude 0 from the total? I am attaching a screenshot of the output as an example.
2. When everything is zero or null for a selection, how can I show a 'No Data' message instead of the table?
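For clarity, the kind of Qlik-side wrapper I have been experimenting with looks like this (a sketch; the If/Null() idea is my assumption about how to suppress the zero before it reaches PixelPerfect):

```qlik
// Sketch: return Null() instead of 0 so the value can be suppressed in totals.
If(Count({<OD_ReporingMonthEnd={'$(=$(vEndDate))'}>} OD_USD_Equivalent) = 0,
   Null(),
   Count({<OD_ReporingMonthEnd={'$(=$(vEndDate))'}>} OD_USD_Equivalent))
```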
I have connected to Facebook to retrieve data using the REST connector. However, the access token is valid only while I am logged in to Facebook; as soon as I log out, it becomes invalid. When I try to reload the data, I get an error that the user is logged out. Could you please help me with this?
Hello Qlik community,
I'm having an issue getting ScriptErrorDetails. I know that it disappears unless I read it right after the error, and here is my case:
I'm loading data from a file as shown below:
---------------------
$(vDIM_SPECIALITES_INFRA) :
LOAD AGENT_EMPLOI_REPERE,
INFRA_EMPLOI_REPERE_LIB,
AGENT_EMPLOI_REPERE_CODE_LIB,
INFRA_MATRICE_BUDGETAIRE,
INFRA_SPECIALITE_LIB,
INFRA_PRODUCTIONe
FROM [$(vFileSourcePath)$(vDIM_SPECIALITES_INFRA)] (qvd);
----------------------
After that i'm having this condition :
---------------------
if(ScriptError <> 0) then
// some code to store loadingStart/end, source file name, number of loaded rows AND the ScriptErrorDetails.
ELSE
// some other code
ENDIF;
--------------------
I need to keep the if condition to detect whether there is an error during loading, and I also want to get $(ScriptErrorDetails) to store it in a log file.
Note that I have about 31 loads like the code shown above.
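To illustrate the pattern I'm aiming for across those 31 loads (a sketch; the SUB name and logging variable are hypothetical, and it assumes ErrorMode = 0 so the script keeps running after a failed load):

```qlik
SET ErrorMode = 0;  // do not abort the script on a load error

SUB LoadWithLog(vFileVar)
    $(vFileVar):
    LOAD * FROM [$(vFileSourcePath)$(vFileVar)] (qvd);

    IF ScriptError <> 0 THEN
        // Read ScriptErrorDetails here, before any further statement clears it
        LET vLastErrorDetail = '$(ScriptErrorDetails)';
        TRACE Load of $(vFileVar) FAILED: $(vLastErrorDetail);
    ELSE
        TRACE Load of $(vFileVar) OK;
    END IF
END SUB

CALL LoadWithLog('$(vDIM_SPECIALITES_INFRA)');
```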
If anyone could help, it would be awesome.
Many thanks
Hi, I have a requirement to flag every month-end's sales. Below is the logic I used, but it always gives zero.
if(date(monthend([Reported Date]),'M/D/YYYY') =date( [Reported Date],'M/D/YYYY'), 1,0) as MonthEnd_Flag
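For comparison, evaluating both sides at whole-day precision would look like this (a sketch; the Floor() calls are the assumption here, since MonthEnd() returns an end-of-day timestamp rather than a plain date):

```qlik
// Sketch: Floor() strips the time part so the month-end timestamp
// can match a date regardless of its time component.
If(Floor(MonthEnd([Reported Date])) = Floor([Reported Date]), 1, 0) as MonthEnd_Flag
```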
Hi
Does anyone know... I have QlikView v11 set up on a server and want to pass login info as part of the URL.
The setup has users set up in a custom directory under the service connectors, and login works fine when the user logs in via the form. However, I want to pass a URL that logs the user in automatically if I know their username and password.
I would therefore expect a URL that looks like:
http://[Server ip address]/qlikview/login2.htm?username=Fred&pass=Security
Does anyone know if this is possible?
Can you recommend solution for this:
QlikView Snowflake data load failing with the following error:
ErrorMsg: [Snowflake][Snowflake] (25) Result download worker error: Worker error: [Snowflake][Snowflake] (4) REST request for URL https://sfc-aus-ds1-6-customer-stage.s3.ap-southeast-2.amazonaws.com
Hi Community Forum,
Hope you can help with this one please?
I have what I thought was a simple query, but after scouring the community page, no one else seems to be asking this one.
I work in a business where we want to calculate the exact number of working days between a 'StartDateTime' and an 'EndDateTime', taking holidays into account. Because this drives KPI reporting, we need to be exact. NetWorkDays isn't doing it for me; here are some details of my experience.
Here is how I currently approached it. Due to the requirement to manage weekdays and holidays, I started with the following function:
NetWorkDays(date_ordered, DateCompleted, $(vStatHols)) as WorkDaysToComplete
However, of course, I only get a whole number returned by this function.
Example - one order looked like this
date_ordered = 5/01/2015 11:51:36 AM
DateCompleted = 7/01/2015 10:28:13 AM
Networkdays = 3
The actual correctly calculated working days is 1.942 days.
The difference from a KPI performance perspective is quite significant (1.058 days) and it magnifies once you spread that logic across a year of order processing and customer reporting.
I will also need to round the exact result up or down to the nearest whole number, so in the above it would of course be 2 days.
If anyone can assist me in how I can achieve this, I would really appreciate it. I don't know what other function will allow the calculation of working days and holidays other than networkdays. Hoping there is an easy solution to get the exact decimal.
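To show what I mean by "exact", the decomposition I am after would look something like this (a sketch, assuming both endpoints fall on working days, a 24-hour day, and a resident table named Orders, which is hypothetical):

```qlik
WorkDaysExact:
LOAD *,
    // Inclusive working-day count minus 1, less the fraction of the first day
    // already gone at order time, plus the fraction of the last day elapsed
    // at completion. For the example order this yields ~1.942.
    NetWorkDays(Floor(date_ordered), Floor(DateCompleted), $(vStatHols)) - 1
        - Frac(date_ordered) + Frac(DateCompleted)          as ExactWorkDays,
    Round(NetWorkDays(Floor(date_ordered), Floor(DateCompleted), $(vStatHols)) - 1
        - Frac(date_ordered) + Frac(DateCompleted))         as RoundedWorkDays
RESIDENT Orders;
```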
Thanks
Giles
A new folder and a category have been created on the QlikView server (client-managed, May 2024 version), and users can't see the category on the QlikView AccessPoint. Any idea why?
Folder permissions are set correctly:
Thank you
Hello,
Is it possible to hide this button for all users?
Our users access apps via a portal, and we do not want them to use Access Point. Thanks
Hi,
I have created different reports; each has the same table with the same width but a different height.
In Outlook 365 on the web, the images have the same width.
But in classic Outlook, the images have different widths.
Can this be resolved?
Thanks in advance!
Does anyone have and is willing to share a Qlik Model/Demo for Plant Maintenance that primarily uses the Pronto ERP system for its source data?
I have looked through the Qlik demo site for something suitable.
I just need to demonstrate capability to a mining company's Plant Maintenance (fixed plant) department, so even screenshots would help.