Pranita123
Partner - Contributor III

API Calling In Qlik Replicate

Hi Team,

We have a scenario where we need to call an API to manage endpoint connections. Specifically, our target is Hadoop, and within the advanced settings, there is an option to set a file size. I have attached a snapshot for your reference.

Therefore, we need to utilize an API that can manage file size. Is this functionality possible within Replicate?

If yes, could you please share the steps to do the same?

 

Thank you.

 

3 Replies
john_wang
Support

Hello @Pranita123 ,

Thanks for reaching out to Qlik Community!

Besides the GUI, we can edit the task and endpoint settings from the command line: export a task, edit the JSON, and then import it again. However, we'd like to understand why you want to change the File size reaches property on the fly.
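For illustration only, here is a rough Python sketch of that export/edit/import flow. The JSON key names used below ("cmd.replication_definition", "databases", "db_settings", "max_file_size") and the file/endpoint names are placeholders for the example, not documented names; open your own export to find the exact property behind File size reaches, and re-import the edited file afterwards through the console (or the repctl import command for your Replicate version).

import json

# Rough sketch of the export / edit / import flow (not Qlik-provided code).
# ASSUMPTION: the keys "cmd.replication_definition", "databases", "db_settings"
# and "max_file_size", plus the file and endpoint names, are placeholders.
# Inspect your own export to find the real property behind "File size reaches".

EXPORT_FILE = "exported_task.json"      # JSON produced by exporting the task
TARGET_ENDPOINT = "my_hadoop_target"    # hypothetical target endpoint name

with open(EXPORT_FILE, "r", encoding="utf-8") as fh:
    repo = json.load(fh)

for db in repo["cmd.replication_definition"]["databases"]:
    if db.get("name") == TARGET_ENDPOINT:
        # Set the new threshold; the real key and unit may differ by version.
        db.setdefault("db_settings", {})["max_file_size"] = 200000
        break

with open(EXPORT_FILE, "w", encoding="utf-8") as fh:
    json.dump(repo, fh, indent=2)

# Re-import the edited JSON through the console (or your version's repctl
# import command), then stop and resume the tasks that use the endpoint.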

In general, when Qlik Replicate works with a Hadoop target, Replicate first writes the records to be replicated into local files on the Replicate server (each table has a separate set of files). Once a file reaches the maximum size defined by File size reaches in the endpoint's Advanced tab, it is sent to HDFS, and only after a successful send is the local file deleted. The setting can be predefined according to your hardware/software configuration and performance demands; personally, I do not see the need to change it dynamically.

Hope this helps.

John.

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!
SushilKumar
Support

Hello Team,

 

If our response has been helpful, please consider clicking "Accept as Solution". This will assist other users in easily finding the answer. If you require further assistance, please raise a Support case.

 

Regards,

Sushil Kumar

Heinvandenheuvel
Specialist III

As per @john_wang: why would one want to update the maximum file size more or less dynamically? I write "more or less" because the endpoint characteristics are only evaluated once, when the task starts, so even if you change the setting with an API you would also need to stop and resume all tasks that use the endpoint.

Note: I would NOT re-import a whole task to refresh endpoint details. You may have to export a task (or export all) to get the JSON, but then I would strip everything except the path down to databases.<endpoint-to-update>: remove everything before it and everything after it, but keep that path itself.
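A rough Python sketch of that stripping step, assuming the export wraps everything in a "cmd.replication_definition" object (check your own export; that key and the endpoint name below are placeholders):

import json

SOURCE = "exported_all.json"          # full export taken from the console
TARGET = "endpoint_only.json"         # trimmed file to import back
ENDPOINT_NAME = "my_hadoop_target"    # hypothetical endpoint to update

with open(SOURCE, encoding="utf-8") as fh:
    repo = json.load(fh)

# Keep only the one endpoint under "databases"; drop the tasks and all other
# endpoints so the import cannot overwrite anything else.
definition = repo.get("cmd.replication_definition", {})
keep = [db for db in definition.get("databases", []) if db.get("name") == ENDPOINT_NAME]
repo["cmd.replication_definition"] = {"databases": keep}

with open(TARGET, "w", encoding="utf-8") as fh:
    json.dump(repo, fh, indent=2)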

Personally, I'd go as far as maintaining a source repository just for my endpoints and only ever manage them as endpoints, not as task components. This approach is very important if more than one task uses the same endpoint. I also strip the databases from the task definitions before putting tasks under source control. That's the only way to ensure that reloading a task does not accidentally reset endpoint settings along with it.
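A matching sketch for that last point, again assuming the "cmd.replication_definition" wrapper key (verify it against your own export): it removes the databases section from a task export before the file goes under source control, so a later re-import of the task leaves the endpoints alone.

import json

def strip_endpoints(export_path, cleaned_path):
    # Remove endpoint definitions from a task export before committing it.
    with open(export_path, encoding="utf-8") as fh:
        repo = json.load(fh)
    repo.get("cmd.replication_definition", {}).pop("databases", None)
    with open(cleaned_path, "w", encoding="utf-8") as fh:
        json.dump(repo, fh, indent=2)

strip_endpoints("exported_task.json", "task_no_endpoints.json")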

Hein.