Hi Qlik Support,
When using an UPSERT error-handling policy (as a result of enabling the "Apply changes using SQL MERGE" option) in a Replicate task that writes to Snowflake, is Replicate able to correctly prevent duplicate records from being written to the Snowflake target, even though Snowflake is known not to enforce primary-key uniqueness?
Apologies if this question has already been addressed in a different community post.
Thanks,
Nak
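A rough way to picture UPSERT semantics, independent of how Replicate actually implements them (the data shapes here are hypothetical): if each change is applied keyed by primary key, the latest version wins and duplicate keys cannot accumulate, even when the target itself does not enforce uniqueness.

```python
# Conceptual sketch of MERGE/UPSERT apply semantics (hypothetical data,
# not Replicate's actual implementation): changes are applied keyed by
# primary key, so the target can never accumulate duplicate keys.
def apply_changes(target, changes):
    """target: dict mapping pk -> row; changes: list of (pk, row) events."""
    for pk, row in changes:
        target[pk] = row  # update if pk exists, insert otherwise
    return target

target = {1: {"name": "a"}}
changes = [(1, {"name": "a2"}), (2, {"name": "b"}), (2, {"name": "b2"})]
apply_changes(target, changes)
# pk 2 arrives twice but ends up as a single row
```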
Hi! Reading the documentation about the Qlik web storage provider 'Google Cloud Storage', it says you can only put data into a bucket in Qlik SaaS. Are there any plans to support this in Qlik Sense Enterprise?
You can store table data in your Google Cloud Storage bucket from the Data load editor, either by creating a new load script or editing an existing one.
Hi Team,
We are facing the issue below. The task is still running, and changes are being captured, but slowly.
Source: SAP HANA
Target: Azure SQL DB
00028240: 2024-04-22T01:28:14 [SOURCE_CAPTURE ]T: RetCode: SQL_ERROR SqlState: S1000 NativeError: 146 Message: [SAP AG][LIBODBCHDB DLL][HDBODBC] General error;146 Resource busy and NOWAIT specified: (lock table failed on vid=3, owner's lockMode: EXCLUSIVE, transID: 24627011275) [1022502] (ar_odbc_stmt.c:2816)
00028240: 2024-04-22T01:28:15 [SOURCE_CAPTURE ]T: Failed (retcode -1) to execute statement: 'LOCK TABLE "XXATT"."attrep_cdc_log" IN EXCLUSIVE MODE NOWAIT' [1022502] (ar_odbc_stmt.c:2810)
Could you please let us know what can be done?
Thanks.
Source: On-Prem SQL server
Target: Azure SQL server
We are copying 196 tables from source to target. It ran fine all night; full load and change processing were perfect. Then we suddenly encountered this error.
I've simplified my data model in the following table with some sample data.
For context, the data consists of three concatenated data sets:
1. Email
2. Email + Email Content Block
3. Email + Email Content Block + Product
An email can have multiple content blocks
Content blocks can contain multiple products
When I select email 1, I get a nice overview of the included content blocks and products.
However, I also want this to work vice versa: when I select CTN, I want to see the linked content blocks and emails and the respective #Send value.
See below for the desired output:
If I select Brush, I want to see the sends from row 4, because this email (1) includes this product.
I also want to see the sends from row 3, because this content block (A) also includes this product.
And I also want to see (though that will work anyway) the sends from row 1, because those are the sends for this product itself.
The same applies when I filter on the Cluster.
And the same when filtering on the Email Content Block.
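A minimal sketch of the reverse traversal being described, using hypothetical sample rows for the three concatenated sets: from a selected product, walk back to the content blocks that contain it, and from those blocks back to the email-level rows.

```python
# Hypothetical rows from the three concatenated sets: email only,
# email + content block, and email + content block + product.
rows = [
    {"email": 1, "block": None, "product": None, "sends": 100},
    {"email": 1, "block": "A", "product": None, "sends": 60},
    {"email": 1, "block": "A", "product": "Brush", "sends": 60},
]

def rows_for_product(rows, product):
    """Selecting a product should surface its own rows, the blocks that
    contain it, and the emails that contain those blocks."""
    blocks = {r["block"] for r in rows if r["product"] == product and r["block"] is not None}
    emails = {r["email"] for r in rows if r["product"] == product}
    return [
        r for r in rows
        if r["product"] == product
        or (r["product"] is None and r["block"] in blocks)
        or (r["product"] is None and r["block"] is None and r["email"] in emails)
    ]
```

Selecting "Brush" here returns all three rows: the product row itself, the block-A row, and the email-1 row.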
Hi, I had a quick question about full load. Say I start my task and it runs for two hours; will any new changes arriving on that DB2 table in the meantime also be replicated?
Thanks
Hello, I am trying to create a range every 15 minutes, where I receive a time and validate whether it is greater than or equal to Desde (from) and less than or equal to Hasta (to).
The function is definitely not working.
Below is an example of how it should work:
Hora_RangoHora | Desde | Hasta | if(Hora_RangoHora>=Desde,1,0) | if(Hora_RangoHora<=Hasta,1,0)
00:00:48 | 00:00:00 | 00:14:59 | 1 | 1
00:00:48 | 00:15:00 | 00:29:59 | 0 | 1
00:00:48 | 00:30:00 | 00:44:59 | 0 | 1
... (same pattern continues for every 15-minute range through the day) ...
00:00:48 | 23:45:00 | 23:59:59 | 0 | 1
load script
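The comparison from the table can be reproduced outside Qlik to verify the expected result. A sketch in Python, parsing the times to seconds so the comparison is numeric (on the Qlik side the usual suspect is comparing text rather than real time values, e.g. values not run through Time#/Interval#, but that is an assumption about this particular script):

```python
# Reproduce the 15-minute range check: a time belongs to a range when
# Desde <= t <= Hasta. Parsing "HH:MM:SS" to seconds makes the
# comparison numeric instead of textual.
def to_seconds(hhmmss):
    h, m, s = map(int, hhmmss.split(":"))
    return h * 3600 + m * 60 + s

def in_range(t, desde, hasta):
    return to_seconds(desde) <= to_seconds(t) <= to_seconds(hasta)

in_range("00:00:48", "00:00:00", "00:14:59")  # True: the only matching range
in_range("00:00:48", "00:15:00", "00:29:59")  # False
```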
I'm new to Qlik Sense.
I'm a new Qlik Sense business analyst, and I'm building a Qlik Sense app to replace Excel VBA code that pulls historical purchases for one or more items; IT will no longer support the Excel VBA :(. My organization is large, and the total number of records in my PO data warehouse is well over 100M. The requirement is that I use Qlik Sense to pull the same data as the Excel VBA code.
My only search criteria is item number. There is no limit to time, geography or entity.
When I build the SQL against the data warehouse, I write it in the Data Load Editor, and it pulls all PO data: over 100M records. A couple of challenges:
1/ When Qlik Sense users attempt to use the app, they often run into out-of-memory errors.
2/ When the app does work, it takes more than 20 minutes before the sheet displays and users can enter item numbers.
Questions:
1/ Is it expected behavior for Qlik Sense to frequently error on such large data sets? I filtered the data down to about 50M records, and it still sometimes errors.
2/ Is there a way in Qlik Sense to allow users to enter the item numbers BEFORE the initial data load and have only the PO history for the required item numbers be pulled?
3/ I tried building a QVD file, but it takes 20 minutes to load the data and another 30 minutes to index. Waiting nearly an hour to run a report doesn't seem to make sense.
What other options, if any, do I have in Qlik Sense to limit the initial data load?
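One generic way to limit the pull, sketched in Python with a hypothetical table and column (in Qlik this would translate to building the SELECT in the load script from a variable holding the entered item numbers), is to push the item filter into the WHERE clause so only the requested history leaves the warehouse:

```python
# Build a warehouse query that fetches only the requested items instead
# of all 100M+ rows. Table and column names are hypothetical.
def build_po_query(item_numbers):
    placeholders = ", ".join(f"'{n}'" for n in item_numbers)
    return (
        "SELECT * FROM po_history "
        f"WHERE item_number IN ({placeholders})"
    )

build_po_query(["A100", "B200"])
# "SELECT * FROM po_history WHERE item_number IN ('A100', 'B200')"
```

In production, item numbers entered by users should be bound as query parameters rather than interpolated into the string.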
Hi,
I've tried everything but can't figure out how to insert a newline character. I would like to show these three fields below each other; 'nl' doesn't work.
=Maxstring(If([L1] = VarL1Out and Isnull([L2]),[HoverText])) &'. Actual: '
&Maxstring(If([L1] = VarL1Out and Isnull([L2]),[Actual])) & '. Target: '
&Maxstring(If([L1] = VarL1Out and Isnull([L2]),[Target]))
Currently the help text looks like this:
My hovertext. Actual: 50.4. Target: 55.5
I want to be like:
My hovertext.
Actual: 50.4.
Target: 55.5
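The three parts need to be joined with a literal line-feed character rather than the text 'nl'; in Qlik expressions that character is typically produced with Chr(10) (assuming the hover text renders line feeds at all). The same idea in Python:

```python
# Join the three parts with a literal line feed (chr(10) == "\n"),
# which is what 'nl' was meant to be.
parts = ["My hovertext.", "Actual: 50.4.", "Target: 55.5"]
hover = chr(10).join(parts)
print(hover)
# My hovertext.
# Actual: 50.4.
# Target: 55.5
```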
I have sections in my data load script.
Currently, I have two sections pulling data from entirely different tables. However, they appear to be linked: when I go to the pivot, it pulls data based on selections made in the other section.
How can I have each pivot pull data only from its respective section?
It is also slowing the application down.
Wondering about the Qlik Talend Data Integration sessions? There are 11, in addition to all of the Data & Analytics sessions. Meet us in Orlando, June 3-5.
Browse our helpful how-to's to learn more about navigating Qlik Community and updating your profile.
Join us on April 24th at 10 AM ET for the next Do More with Qlik webinar focusing on Qlik’s Data Integration & Quality solutions.
Qlik enables a frictionless migration to the AWS cloud for Empresas SB, a group of Chilean health and beauty retail companies employing 10,000 people across 600 points of sale.
Meet Qlik Luminary Stephanie Robinson of JBS USA, the US arm of the global food company, which employs 70,000 people in the US and over 270,000 worldwide.
Join one of our Location and Language groups. Find one that suits you today!
A private group for healthcare organizations, partners, and Qlik healthcare staff to collaborate and share insights.
This is the Japanese-language group of Qlik Community. You can download Japanese-language materials on Qlik products and post questions in Japanese.
Welcome to the group for Brazil users. All discussions will be in Portuguese.
Hear from your Community team as they tell you about updates to the Qlik Community Platform and more!