
QnA with Qlik: Qlik Replicate Tips

Troy_Raney
Digital Support

QnA with Qlik: Qlik Replicate Tips

Last Update:

May 22, 2023 6:47:01 AM

Updated By:

Troy_Raney

Created date:

May 3, 2023 8:58:04 AM


Environment

  • Qlik Replicate

Transcript

Hello everyone, and welcome to another session of Q&A with Qlik webinar series.
We're excited to have all of you here today. Today we'll be covering
Qlik Replicate Tips, which will hopefully help those who are new to the topic or existing users looking for tips on the subject.
This is a live question & answer session. What we would ask you to do is put those
questions in the QA panel below and we will try to get to them all.
Before we get started, there are just a few things we want to go over. Our link for the survey is going to pop up at the end.
It is a QR code. That QR code leads to a survey of less than two minutes, just telling us
how you liked this session and what you thought about it. We do have an upcoming session next month.
Here is a QR code where you can register for that. Scan it with your phone and it will take you to the page.
You'll register for the next upcoming Q&A with Qlik.
Again, today we are covering the topic, Qlik Replicate Tips
my name is Emmanuel Herndon. I am one half of the Q&A co-host team here at Qlik, and I am a Digital Customer
Success Specialist whose focus is on helping customers from trials
to renewals and in between, enhancing their customer journey
with videos such as this one and webinars. I am not doing this alone, as mentioned before, my colleague Troy is helping me.
Will you say hello, Troy? Hi, everybody. My name is Troy Raney, and I'm happy to work with Manny on this one.
And we got a great panel for you today. Let's go ahead and introduce them.
Kelly, want to say hi? Hey, everyone. Thanks for joining. My name is Kelly Hobson and I am a Support
Engineer with the Data Integration and AutoML group. Thanks. Allen?
Hello, I'm Allen Wang. I'm also part of the support group for the Qlik Data Integration team.
Great. Swathi? Hi, everyone. This is Swathi. I'm also on the support team, and I'll be working on the Qlik Replicate tool.
Thank you. And Bill? Hi, this is Bill Steinagle.
I work in support for QDI as well. Thanks.
Looks like a young buck there in that picture. Yeah, that's an old LinkedIn picture from IBM days.
All right. Well, we've already got a couple of questions coming in.
We're going to start with the first question.
What should be checked prior to upgrading Qlik Replicate?
What should be checked prior to upgrading Qlik Replicate?
Swathi, you want to take that one? Yeah, I'll go with this. Usually, before upgrading, it is always recommended to check
the accumulated release notes to understand the fixes that went into that particular version, and also the user guide for new changes,
because there is a limitations section where we can go and check
the limitations that will apply before upgrading.
Also, the user guide documents the upgrade process, like taking a backup of the data folder and of the entire server repository.
These are things we should consider prior to upgrading Qlik Replicate.
Great. Thanks. I'm going to add to that as well. Before you check any endpoints,
you also want to check the driver versions and the database versions. Make sure the version of Replicate that you upgrade to will support them.
If there's anything that's deprecated in the current version of Replicate, you may end up with tasks that will not work.
Okay. I see a couple of questions getting addressed in written form.
Let's try and bring them up so everybody can hear the question. There was one: can Qlik Replicate write to QVDs as an endpoint?
Can Qlik Replicate write to QVDs as an endpoint?
We're talking about Qlik Replicate On-prem.
Not using Qlik Replicate alone, we cannot, but currently we have QCDI.
Using QCDI, we can write to QVDs as an endpoint. Side note there: if they write to a file target,
Replicate creates CSV files, which they can then load, I believe. That would be an extra step
in legacy Replicate rather than QCDI. You can check the supported endpoints section.
Thank you, Bill, for that. There was a question that was answered in written form. Just want to go over it.
Any documentation on Python API?
Any documentation on Python API? I think Kelly gave a link, which I will share.
But Kelly, do you want to go into more detail? Sure. For the API tools and engagement, we point people to the Enterprise Manager
documentation, which has the full list of API documentation and commands you can use.
I dropped that directly to Mike, who asked the question, and it also looks like we're going to share this documentation via this link.
Yeah, that's what that looks like. Just wanted to bring it up.
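As a rough sketch of how the Enterprise Manager REST API is typically driven from Python: you authenticate once and then attach the returned session ID to every call. The base path below is a placeholder, not something confirmed in this session, so check it against the Enterprise Manager API documentation linked above.

```python
import base64

# NOTE: this base path is a placeholder -- confirm the real API root in the
# Enterprise Manager API documentation before using it.
QEM_BASE = "https://qem-server/attunityenterprisemanager/api/v1"

def login_request(user: str, password: str):
    """Build the URL and Basic-auth header for a QEM login call.

    Enterprise Manager answers the login call with a session-ID header
    that must accompany every subsequent request. The HTTP call itself
    is left to whatever client you prefer (urllib.request, requests, ...).
    """
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"{QEM_BASE}/login", {"Authorization": f"Basic {token}"}

url, headers = login_request("admin", "secret")
print(url)  # the login URL to call with your HTTP client
```

The actual endpoint names and response headers are documented in the Enterprise Manager API guide, which is the link shared in the chat.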
Is it possible to join tables within a Qlik Replicate task?
Thank you. All right, next question. Is it possible to join tables within a Qlik Replicate task?
If you need to transform replicated tables to a different data model in a subsequent processing step,
how can you access only the changes without processing the entire database? Yeah, I can go with and you can add if I miss anything.
Actually, with a Replicate task, no. But as I stated, we have the QCDI tool.
Using QCDI, we can have SQL transformations, like a dbt-style tool.
So using QCDI, we can. Can I add to that, Swathi?
Yeah, Bill. With Replicate currently, you can use the Store Changes option.
That creates change tables for those certain tables; you would then be able to join them into the data
model, like with the Compose product or other downstream products.
Okay, thank you. We do have another question from participants.
Are there any future Replicate plans to support Oracle autonomous as a source endpoint?
Are there any future Replicate plans to support Oracle autonomous?
Yeah. Sorry about that. As a source endpoint.
I think we should take that back to the lab and follow up with product management.
Yeah. Unfortunately, we don't have anybody that can speak to the roadmap today. Right. I
wanted to just point out that there's on the Qlik community, there's our ideation forum and our Product Management team keeps a close eye on that.
So if it's not happening, it's a good idea to post that suggestion there and get some colleagues in the community to upvote it.
And I'm sure it's something that people will pay a lot more attention to when customers
are in that forum asking for it. Great.
Is single sign-on or SAML authentication on the roadmap for Qlik Replicate on-prem?
All right, next question. Is single sign-on or SAML authentication on the roadmap for Qlik Replicate on-prem?
Oh, another roadmap question. Anybody have any insight into that one? We'd have to check for Qlik Replicate, but it is available on QEM.
But if it is for Qlik Replicate, yeah, we'd have to check that. It's available on... what was that?
Qlik Enterprise Manager. Enterprise Manager. Thank you. Okay, thank you.
We have another question, it's broad, but I think it's good to ask.
What are some of the new features that we should know?
What are some of the new features that we should know about? Any good documentation that can be shared?
Yeah, as I stated earlier, before upgrading you have to check a couple of things.
Same for any new features: we have to check the release notes for that particular version.
Usually, we'll have that link. I can share the release notes link in the chat.
If we go to the user guide, down below, we'll have an option to select the release notes for that particular version.
The release notes there? Yeah. I'll push that so everybody can see it.
Yes, I can. Okay. Thank you. You're welcome.
Another quick note: if you scroll up on this documentation, on the left-hand side,
there's a section called What's New for each new release.
Yeah, it's good to keep an eye on that. Thank you for pointing that out.
Okay. Next question.
It's a bit of a thread here. Prerequisites are incomplete.
Replicate prerequisites are incomplete. Don't see libraries like enum34 and EM Client?
We don't see any libraries like enum34. Also, libraries like EM Client.
I'm not quite sure if I followed. Does anyone else?
Yeah. Want to address that? Yeah. What specific prerequisite is being asked about?
All the prerequisites for Replicate are defined in the user guide, and there are also
the ODBC requirements for QEM, Enterprise Manager.
I'm not sure if that's the question. We don't have a library called enum34 that I'm aware of.
If you can provide an example, I'd be happy to look. EM Client, I believe,
is referring to Enterprise Manager, which doesn't require any client.
That's just the installation of the product. Okay, next question.
Will APIs support, adding and deleting endpoints soon?
Will APIs support adding and deleting endpoints soon?
Right now they only support editing existing ones. So they're looking for an API that will allow you to add and delete endpoints.
So I believe that's a script that you would run against Replicate.
I could take that back as an action item, I believe.
So they want to use a scripting tool to be able to add and delete endpoints from Replicate.
Yeah, that's how I understand it as well.
They're already doing it with editing, but not with adding and deleting.
Correct. Okay. Also, go ahead. I just wanted to add on that because of the whole repository that gets updated,
that would be taken into consideration as well.
Inside QEM, there are definitely APIs that already exist for deleting endpoints for Replicate. I'll have to check,
because most of these RepCTL commands are not supported. Put in an Ideation to bring more
of them into the supported list and have them added to the user guide. But as of now,
Qlik Enterprise Manager does support deleting endpoints via APIs. Okay, thank you. We have another question.
For larger tables, is there a way to force Replicate not to use parallel loading?
It says: I have a four-column table with more than 3 billion records loading
in SQL Server, but Replicate forces me to use parallel loading.
Is there a way to force replicate not to use parallel loading?
Sorry about that. Yeah, actually, parallel loading is an option that we select.
If we don't want to use that option, we can leave it unchecked. And still, if the customer wants to load
huge tables without using the parallel load, they can use the pass-through filter based on a date column or something similar.
Using the pass-through filter, they can also load huge tables.
But parallel load, by default, won't be checked. It is an option that we select, and we
divide the table into equal segments based on the PK. Or if the table is already partitioned
at the source, we can select that partition option too. Based on the partitions, it will divide into segments.
You can find the parallel loading settings if you click into the table itself.
It's under the table settings, not a task setting. Just a side note: parallel load would be for speed, obviously.
Just wondering why they would not want to use that option, unless it's a load concern on the source side.
This is all about tips, so if you got a recommendation, feel free to speak up. Sure.
Documentation on all Replicate commands with examples
Somebody's asking if there's documentation on all Replicate commands with examples.
The currently supported RepCTL commands are all listed in the user guide. If they're not in the user guide, they're not officially supported.
Though there are ones that you can use, you may find some of them listed in Community posts that aren't in the user guide.
Feel free to use them. But as we note, it's not officially supported if it's not listed in the user guide.
And now, by user guide, you're referring to help.qlik.com? Yes, I am. Okay.
If you have a link to add to this, that'd be great.
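Since the panel stresses that only the RepCTL commands listed in the user guide are officially supported, a thin Python wrapper that assembles and runs a `repctl` command line can be a safe way to script them, because you can review the exact command before executing it. The `start` command word and the data-directory path below are taken from this discussion and are assumptions to verify against the user guide.

```python
import subprocess

def repctl_command(*args, repctl_path="repctl", data_dir=None):
    """Assemble a repctl command line as a list of arguments.

    Returning the list (rather than running it immediately) makes it easy
    to log or review the exact command before touching a server. The -d
    flag pointing repctl at a data directory is optional here.
    """
    cmd = [repctl_path]
    if data_dir:
        cmd += ["-d", data_dir]
    cmd += list(args)
    return cmd

def run_repctl(*args, **kwargs):
    """Execute the assembled command and capture its output."""
    return subprocess.run(repctl_command(*args, **kwargs),
                          capture_output=True, text=True)

# The 'start <task>' form is the one mentioned later in this session;
# verify the exact command word in the user guide before relying on it.
cmd = repctl_command("start", "my_task",
                     data_dir="/opt/attunity/replicate/data")
```

Remember the caveat repeated in this session: commands not listed in the user guide are unsupported, even if they work.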
Why does an Oracle source table require a primary key to replicate to an Oracle target?
All right, thank you. Next question that we have: why does an Oracle source table require a primary key to replicate
to an Oracle target, but not for an Azure Data Lake Storage (ADLS) target?
Yeah. To apply DML changes, we need a PK for an RDBMS target, but ADLS is a file target, so that is the difference.
Great, thanks. We got a question about upgrading.
What’s the best upgrade path from an older version?
If upgrading from two versions behind the current version of Qlik Replicate on prem, do I need to upgrade both versions or can
I skip the older upgrade and just upgrade to the newest? It's a good question.
Yeah. Actually, for upgrading, there will be a path. If a customer wants to upgrade from version 6.6
to the latest, the accumulated release notes have the path, so they have to go based on that path.
It will be mentioned in the user guide also; we need to follow the upgrade notes
from the user guide, and the customer should go the recommended way.
Even if we skip versions it may work, but it is always recommended to follow the notes.
And what do the notes say? I can check that.
We can give an example. For example, if a customer wants to upgrade from
2021.11 to the latest version, then they have to go through particular versions.
I can make an example of that and provide the link where they can go and check.
Just give me a second. Thanks, Swathi. That'd be great. Troy, if you can move to the release notes.
Here, inside the release notes, it will give you the version you can upgrade from and the version you should upgrade to.
If you scroll all the way down to the PDFs. Thanks. And move to one of the release notes, the very first link.
Yeah. Migration and upgrade. Just click there. And this is how.
Yeah. So they have to follow this.
That's great. I appreciate you walking us through that documentation.
Roadmap for support of integration to MQTT for IoT devices?
Great. All right, thank you. Next question. It says: anything on the roadmap for support of integration with MQTT
for IoT devices, if that's the protocol for transport?
This would be another one
we could set aside and follow up with our product team. But also I think this would be a good idea to request on the Ideation page to get
some traction from the group that's reviewing those requests.
But as far as I know, I've not seen MQTT or Mosquitto support from Replicate.
On that topic, we got another roadmap question. They always come up, especially after Qlik World.
Any roadmap to support cloud ERPs to extract data via APIs?
Any roadmap to support cloud ERPs to extract data via APIs where database access is not possible?
I know Salesforce is supported, but I'd like to see more systems on the list. Any insight, anybody?
Otherwise, it's definitely another one we could move on and forward that to our product management team.
And Manny's also posted a link to the Ideation page in the Qlik Community.
So it's definitely something to take a look at. And for that question on the roadmap for cloud ERPs, we have Salesforce, MongoDB.
It's basically the ones that are listed under the supported endpoints.
But yes, we can follow up. But if there's something not listed, an Ideation will be best as well.
Best practice when we must truncate and reload a fact table?
Okay, thank you. Next question.
We replicate our data warehouse to clients. What is the best practice when we have
to truncate and reload a fact table due to so many changes
(faster than insert, update, and delete)?
Anyone have ideas for an easier way for them?
Fact table. I think this is related to Compose; we'd have to check with our Compose expert.
Well, we'll have to see if we can find an answer to that and get back to you.
I also mentioned that you can ask questions within Qlik Community because there are tech support experts there who are monitoring that.
So if you ask that question in Qlik Community, I'm sure that someone will give you
We need to resume or reload tasks, is there a way to run all tasks at once?
an answer or point you in the right direction for this. Okay, our next question, when a connection issue happens
and we need to resume or reload tasks, is there a way to run all tasks at once?
Through Replicate, they cannot. But if they have Enterprise Manager,
they can select all tasks at once, and using that, they can resume all their tasks.
But with Replicate, they can only go one by one; there is no option to select all the tasks.
They can install QEM (Qlik Enterprise Manager), and through QEM they can add the Replicate server. Whenever this kind of activity happens, they can select the tasks and resume them.
I just wanted to add a comment to that. If you're running with Log Stream, that's just a no, because
there's a timestamp and timeline that the Replicate tasks read from.
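If you'd rather script the bulk resume through Enterprise Manager's API than click through the UI, the loop is simple: one task-control call per task. The URL shape below is purely illustrative, not a confirmed endpoint, so look up the real task-control paths in the Enterprise Manager API guide before use.

```python
def resume_calls(server: str, tasks, base="https://qem-server/api/v1"):
    """Yield (method, url) pairs for resuming each task on one Replicate
    server through Enterprise Manager.

    The endpoint shape below is illustrative only -- the real
    task-control paths are in the Enterprise Manager API guide.
    """
    for task in tasks:
        yield "PUT", f"{base}/servers/{server}/tasks/{task}?action=resume"

# Generate the calls for two hypothetical tasks on one server:
calls = list(resume_calls("replicate-01", ["orders", "customers"]))
```

Each yielded pair can then be handed to any HTTP client, along with the session header obtained at login.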
How does somebody access the Postgres database that resides on QEM?
Okay, next question. How does somebody access the Postgres database that resides on QEM?
I guess that's Enterprise Manager? Yeah. The Qlik Enterprise Manager Windows server,
to access Enterprise Manager logs for analytics purposes.
For this, if they have the username and password, they can connect to Postgres on the QEM server, the same as other Postgres
instances, or else they can use third-party tools like DBeaver. That's a good tip. Yeah.
I've used DBeaver or pgAdmin as the Postgres client.
If you have the login information, you can connect to it. Okay, thank you. Next question: we stopped tasks to allow for source database changes and refreshes.
Any tips for avoiding errors when resuming processing?
When trying to resume processing, many tasks fail with LSN backup errors when
trying to find a backup older than we maintain. Any tips for avoiding this, as it then requires a full reload?
They resume the process and they're getting an error. That would depend on the database maintenance, how long the tasks were down,
and the retention period on the backups and where the backups are being read from.
Depending on your source, if you have archives, you can change some of the source
endpoints to point to just archives or just the backups. But it's dependent on the retention period of the transaction backup logs.
Is there a best practice for how long to keep those? It depends on the incoming changes,
the amount of logs, and the size of the transaction logs. It's all dependent on the source side.
Yeah. Okay.
Where can we subscribe for notifications when new releases are available?
All right, next question then. Is there a notification process that we can sign up for to proactively receive emails when new releases are available?
Sounds like the support updates blog to me. Anybody else have a better one?
Not that I know on the top of my head. We can look at the downloads page to see if there's a subscription.
I definitely want to plug the support updates blog. If you guys haven't subscribed to this one yet, you should.
Well, we had a blog post about this session today, but we also
make posts in here about maintenance happenings, new releases,
anything you need to know about, we end up posting here. So this is definitely a good one to subscribe to.
Where are console command lines for Replicate?
Okay, thank you. Next question: is there a console command line to start, resume, or stop a Replicate
task without having to use the Enterprise Manager API?
Technically, yes. I'm going to have to look through the user guide to see if it's listed in there. Well, that is RepCTL start, I believe, and then the task name.
I will need a minute to check if it's listed inside the user guide. Sure, thanks.
It's always good to have some documentation to back it up.
All right, well, Allen's taken for that. I'll move on to the next question.
How does replicate handle independent data marts?
How does Replicate handle independent data marts in order to avoid spaghetti architecture?
What is the best practice in this case? This also comes under Compose.
This you'd have to check with the data marts side (link in description).
Okay. And another note: depending on how their data
mart is structured, we also recommend our Professional Services team, which
can help with implementation, taking a good look at your environment, your tables, and what you're working with.
They can help give some guidance on best practices when looking at your full system and what your goals are.
That's a good point there. They can definitely help with that.
While importing tasks, are there plans to introduce a way to merge tasks?
Okay, thank you. Next question, while importing tasks, are there plans to introduce a way to merge tasks?
Any takers?
I think currently we don't have this merge-tasks capability.
This should be a feature request. Or QCDI, which is one of our newer
products on Qlik Cloud, may have some functionality for this.
There's no current way to do task merging in the UI. What most people do is they export
the task JSONs and copy and paste tables over. That's a quicker way of
managing it without having to add tables one by one through the UI, but take note that you need to be really careful to keep
the syntax the same, or that JSON will not import properly.
I'd like to go back to the previous question on using RepCTL to start and stop tasks.
They're not officially listed in the user guide. I can provide you with a community link that has these RepCTL commands,
but note that these are to be used at your own risk and are not officially supported.
Yeah, I just posted that link.
Thanks. On the importing/merging tasks question: is that like you want to export one
repository to another? Maybe give a little more info there,
because we do have that capability. Okay, next question.
Is parallel loading an option with Azure Data Lake Storage target?
Is parallel loading an option with Azure Data Lake Storage target?
Yeah, I just checked the user guide, and we have the option. Parallel load is available for the ADLS target.
I just brought up the user guide link where it states the source endpoints and target endpoints that support parallel load.
Thanks. I was just moving things over that you posted. I didn't realize it was about that question.
Hadn't read it yet. Appreciate that. You're moving ahead. All right, thank you. Next question.
Does Qlik Replicate handle foreign keys? And previously they said: doing some tests we made,
we noticed timing issues using target_lookup_filling with FKs.
Does Qlik Replicate handle foreign keys?
No, it works with the primary key. But in case you observe this scenario, I think we have to take a look,
and we recommend you to create a support case on this. But if anyone wants to add?
Yeah. That would be batch optimized apply versus transactional apply, I believe,
where obviously transactional apply is slower.
But back to Swathi's point, we may have to look and get back to you.
Because if you want to keep relational database integrity,
which is what they're doing with foreign keys, then that's transactional apply. With batch apply, we can't retain relational integrity.
How to interpret “fail to get a value for name int tab” error?
Okay, next question. When you encounter an error in the log like this: "fail to get a value for name int tab",
what would be your interpretation when you see this type of error? Any ideas?
Just doing a quick look internally. This might happen if you have duplicate
tables in the business group for your SAP source. If that's the source you're looking at, that's where I've seen that guidance.
I would advise checking if you have duplicate tables in business groups for this SAP source.
If you have duplicates, eliminate them. That's a good place to start.
But if it continues, I would say go ahead and open a support case and we'll take a look.
Okay, thank you. Next question, any roadmap to support Snowflake as a standard source endpoint?
Any roadmap to support Snowflake as a standard source endpoint?
Currently, no, for Qlik Replicate, but I think yeah, we have to check whether there is any roadmap.
But as I'm aware, no. Okay, next question.
Question 21 Part 2
The link for the Replicate command question only brought me to the introduction section.
Okay. I'm going to expand on this. For the most part, these commands are listed throughout the user guide.
If you search specifically for RepCTL, you'll find them listed everywhere.
In the installation part of that user guide and the upgrade part of the user guide, you will find certain commands
to export the JSON definitions of the host server, etc. There are a few more RepCTL commands listed in the user guide.
It's going to be a little hard to go section by section in terms of where they're located. Just use the search function and you
should be able to find RepCTL commands throughout the guide. There should be a PDF version of the user guide.
I can bring that up and you can do searches on that.
I just wanted to add, on the Snowflake question: there is an ODBC driver you can use to pull into Snowflake.
Usually, that's done by Professional Services because it's a nonstandard endpoint.
Thanks for that, Bill. Sure. Next question.
Qlik On-Prem: What's the typical upgrade path to SaaS?
So they have Qlik Replicate On-prem and they're looking for the typical upgrade path to SaaS.
Is there any documentation on upgrading from on-prem to SaaS?
And I know there are a lot of people working on that. There are project managers focusing on that,
but I haven't seen the actual documentation yet. Does anyone else know?
I'm not sure if there is one officially out there, but I would say it should be
coming soon, because I know that we are starting to have more QCDI customers, which is our cloud product, and there should
be more as far as migration, or how to move a similar Replicate task
to that type of product, which is then hosted on the cloud. But I think that may be one to table
and follow up on after we search around a little bit.
Okay, thank you. There was a question, I think it got missed and sent
to the answer section, but we'll just go over it. The question was: we are considering using Log Stream in an OLTP for cases where we
How to normalize Log Stream tables in an OLTP in cases of failure?
are normalizing tables. We would like to know if it is safe, in cases of failure, to restore Log Stream tasks without using staging tables.
I don't believe we use staging tables when you use Log Stream.
A Log Stream task is set up to just read from the transaction logs and then write them up to the server.
Then you set up your Replicate tasks to read from the Replicate server, where basically all the changes are defined.
Log Stream basically makes a copy of your transaction log, puts it on the Replicate server,
and then your Replicate tasks read from that. Okay, and the final stretch here; we'll see how many questions we can get addressed before we run out of time.
Disable the Replicate article, truncate the table, then reload: any advice?
The next question: we don't use Compose, and I have asked this question on the community. We have to disable the Replicate
article, truncate the table, then reload the table. I guess the question is: any advice on how to do that?
SQL Server, I'm assuming? Yeah, I think we'll have to make some assumptions here.
Go with your best bet. Yeah, sorry. A Replicate article table in SQL Server is an article.
I just want to make sure we understand the question on the source.
I think if you want to remove the table, first we have to disable the article,
I mean, on the SQL Server side; only then will we be able to remove the table.
Otherwise, we won't be. I think that's their question. Okay, thank you.
Any SAP transports required to use the new SAP ODP connection type?
Next question, are any SAP transports required to use the new SAP ODP connection type?
That's a good question. Yeah. For ODP connection types, I would just follow the prerequisites.
Any transports needed would be listed there. Moving on to the next one.
How to connect S/4HANA Cloud with Replicate using the ODP endpoint?
There isn't much documentation available in the user guide for this. We can take that back and share an update on that.
It's a newer endpoint, and we can definitely share documentation.
We need to get back to you on that. All right. But since that one was anonymous, you could always post that in
the Qlik Community forum for Replicate, and we can address it there. It's a good place for that.
How to run a Log Stream staging task for synchronizing data?
Okay, next question. When I schedule the task periodically for synchronizing data, do
I have to run both of a Log Stream staging task and stage data to the target task?
Sorry about that. When using Log Stream, right? Yeah.
First, we have to synchronize the parent task, the Log Stream task, as well as the stage task, because the stage task is the one that
reads the changes from the Log Stream storage path. So both should be synchronized.
Thank you. All right, next question. Can we write Replicate logs alone to a separate drive or folder?
Can we write Replicate logs alone to a separate drive or folder?
I know we can separate the data folder, which contains logs, but we need the log folders alone separated.
No. Because logs are inside the data folder, we'll have the log folders and files there.
I think we cannot separate the log folder onto another drive.
Ideation is a good place for that one.
Okay, thank you. Next question: any idea why this grant is
Any idea why this grant is needed in an Oracle database for Replicate users?
needed in an Oracle database for Replicate users?
Grant select on system database directories to Replicate users.
I can take that as a follow-up question and reply.
That's very specific. But again, that's another anonymous one.
So if you want to pose that question in the Qlik Replicate forum or open up a support case,
we'll see if we can address that and get you a response with more detail. Any tips for scheduling tasks for synchronizing?
I think it's vague. May need some more information on that.
It depends on your source and target. Okay.
What is the optimal way to migrate from dev to production?
Next question: what is the optimal way to migrate from development and test to production when
the endpoint definition names are different between production and development
and you do not want to recreate the task manually? They tried exporting, altering the JSON,
and importing it into production without success.
Any takers? I'll dig this up. I know I submitted an Ideation a while ago
because I worked with a customer who had a very similar set up where they wanted
more of a continuous delivery CI/CD process. I'll try to follow up with an answer on this thread when I find it.
But it was the same thing was they were moving from Dev to test to prod and each time their source and targets were
pointing to the respective Dev test environment, Dev test prod servers.
That is something that has come up and there is an Ideation request out there.
Great. Also, I think they can write a PowerShell script to read tables from the JSONs.
We have to check whether we have APIs for this.
If we do, I think we can provide them.
Okay. We got just a handful of questions left. Let's see if we can get through them all before we run out of time. Regarding the merging task questions
How to merge an existing task with another task containing also other tables.
follow up, I would like to merge an existing task that contains one or more tables with another task containing also other tables.
Right now, I'm using a Python script to merge the manipulations, explicit included tables, and configurations from the exported JSON.
So yeah, that was the one looking to merge existing tasks when importing.
Okay. This one isn't really a question, but it's the same as I've said before. You can merge information from one
exported task JSON to another one. You just have to be careful merging stuff, because you do have to make sure all the information is in the right format.
Tables belong in the table section; manipulations belong in the manipulation section; transformations in the transformation section, etc.
Spaces, commas, all those do matter. If you do get something in the incorrect syntax, it's really hard to spot.
Normally, if you mess up on one JSON, you can still export one from the task UI to get a new JSON, but keep the original backup just in case, so you have something
to compare it to and to work through getting something merged.
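Since a stray comma in a large merged export is hard to spot by eye, a quick parse check like the one sketched below, using only Python's standard `json` module, can point to the exact line and column of a syntax slip before you attempt the import.

```python
import json

def check_task_json(text):
    """Parse a merged task export and report where parsing fails, if it does."""
    try:
        json.loads(text)
        return "valid JSON"
    except json.JSONDecodeError as err:
        return f"line {err.lineno}, column {err.colno}: {err.msg}"

# A trailing comma like the one below is exactly the kind of slip
# that is hard to spot by eye in a large merged export.
print(check_task_json('{"tables": [1, 2,]}'))
```

This does not validate that tables ended up in the tables section and so on; it only catches syntax errors, which is still the fastest first check after a hand merge.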
Yeah, thank you. And in the exported JSON of the task, there should be the table list.
So within the Python script, that's where you can pull the table list and merge it
into the other task, depending on how you write your path in the script. Okay, thank you.
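A minimal sketch of that merge step might look like the following. The `owner`/`name` keys and the sample entries are assumptions for illustration, so match them to the table-list structure you actually see in your own export.

```python
def merge_table_lists(base_tables, extra_tables):
    """Merge two explicit-table lists, skipping duplicates
    (tables matched on owner + name)."""
    seen = {(t.get("owner"), t.get("name")) for t in base_tables}
    merged = list(base_tables)
    for table in extra_tables:
        key = (table.get("owner"), table.get("name"))
        if key not in seen:
            seen.add(key)
            merged.append(table)
    return merged

# Example entries shaped like table-list items in an exported task JSON;
# the key names are assumptions, so check them against your own export.
task_a = [{"owner": "HR", "name": "EMPLOYEES"}]
task_b = [{"owner": "HR", "name": "EMPLOYEES"},
          {"owner": "SALES", "name": "ORDERS"}]

print(merge_table_lists(task_a, task_b))
```

Deduplicating on owner plus name keeps the merged task from listing the same table twice when both source tasks include it.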
The next question is from someone using a single Log Stream with multiple targets.
They are concerned that when heavy transactions are running, they will see latency and service issues.
Are there any timed sleep functions in Qlik Replicate?
They want to know: is there any timed sleep function in Qlik Replicate?
There are no specific sleep functions, but you can use the scheduler to stop certain tasks at scheduled points in time
and then start them back up again with the scheduler.
It's a good tip. Next question: we use Hadoop as a target endpoint.
Is there a better way to manage a single endpoint to control the HDFS folder?
Today, my customer likes to have targets replicated into separate HDFS folders. Based on my knowledge of setting up endpoints
and tasks, I create an endpoint for each unique HDFS folder. This has resulted in many endpoints,
even though the other endpoint configurations are all the same. Is there a better way to manage a single
endpoint that controls the HDFS folder on a per-case basis at the table level within a task?
No, currently we don't have that option. They have to go with separate tasks, because the HDFS folder
is specified in the target endpoint configuration itself. So if they want to write to multiple
different HDFS folders, they have to use different tasks; we don't have it at the table level.
If they need that feature, I think they should file an Ideation request.
How can I do CDC if the table has no primary key?
Okay, thank you. Next-to-last question: how can I do CDC if the table has no primary key?
They can do CDC, but batch apply won't work, and changes will be applied slowly, one after another.
If they want to go with batch apply, they actually cannot, because that table's batch will get
split and the changes will be applied one by one.
Updates will be missed for the most part. Depending on the endpoint, for SQL Server you have other options, such as using Microsoft CDC change tables as a way
to work around the missing primary keys; but depending on the endpoints, you'll have to check the user guide to see if there are any workarounds.
Thanks.
Is there a way for Qlik Replicate to run a post-load script after a full load has completed?
Is there a way for Qlik Replicate to run a post-load script after a full load has completed?
Yes, it depends on the endpoint. There are post-processing commands you
can pass, but it depends on the endpoint. Okay. And we'll see if we can squeeze in one last question before we wrap up.
How do you resume the child task if the Log Stream parent task has failed?
How do you resume the child task if the Log Stream parent task has failed for some reason?
Yeah. So if the logs are still available on the source side, they can just resume the Log Stream task as well as the child task.
But if the logs are not available at the source,
they can check with the DBA and have the logs restored.
But if, due to some latency or another reason, they want to start the Log Stream task from a timestamp,
then a different audit folder will be created with that timestamp. In that case, the child tasks also have to be started from the same timestamp.
If they simply resume, they can resume; but if they start the Log Stream task from a timestamp, then they have to start
the child tasks from the same timestamp as well. Okay, that was all the time we had for today.
I want to thank everybody for your participation. We had some excellent questions. And absolutely, thank you to our wonderful panel.
Everybody, this has been great. Here's a QR code with a link to the survey about today's session.
You'll also get that link once this Zoom session ends, I believe. The recording of this session will be available on Qlik Community
and on the Qlik Support YouTube channel soon, so keep an eye out for that. Thank you, everybody, and have a great rest of your day.
