amarvilass
Contributor III

Can AWS store the processed output in Qlik Sense SaaS?

Hi all,

We are working on setting up an architecture where we generate some output from AWS SageMaker. We want to check whether there is any possibility that AWS can access Qlik Sense SaaS directly and store the files in Qlik Sense SaaS.

Does anyone have experience with this? Kindly suggest.

 

Thanks & Regards

Amar

Labels (1)
  • Cloud

1 Solution

Accepted Solutions
Daniele_Purrone
Support

Hi @amarvilass , if you can run API calls from your AWS environment, then you should be able to do it that way.

https://qlik.dev/apis/rest/data-files#%23%2Fentries%2Fv1%2Fdata-files-post 
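For illustration, a minimal Python sketch of that call, assuming an API key for the tenant; the tenant URL and file name are placeholders, and the multipart part names ("Json" and "File") follow the data-files reference above:

# Minimal sketch only - tenant, API key and file name are placeholders.
import json
import requests

TENANT = "https://your-tenant.eu.qlikcloud.com"   # placeholder tenant URL
HEADERS = {"Authorization": "Bearer <api-key>"}    # placeholder API key

with open("predictions.csv", "rb") as f:
    resp = requests.post(
        f"{TENANT}/api/v1/data-files",
        headers=HEADERS,
        files={
            # "Json" carries the metadata, "File" carries the binary content
            "Json": (None, json.dumps({"name": "predictions.csv"}), "application/json"),
            "File": ("predictions.csv", f, "text/csv"),
        },
    )
resp.raise_for_status()
print(resp.json())

If the call succeeds, the response body should describe the newly created data file.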

Daniele - Principal Technical Support Engineer & SaaS Support Coordinator at Qlik
If a post helps to resolve your issue, please accept it as a Solution.


8 Replies
Dalton_Ruer
Support

Not sure exactly what your use case is, but the answer is yes.

1. Qlik Sense SaaS has a SageMaker connector that supports passing data to any SageMaker endpoint to run predictions. Those predictions can be batch ("here are 10,000 records, give me the predictions"), they can be called inside a chart, or they can be called via an expression supporting what-if scenarios ("I have changed this, that, and the other thing about this customer; if I did this, would the prediction change? Now go run the prediction").

2. Qlik Sense SaaS can read files from S3 buckets. So if your goal is to run massive predictions nightly, or on whatever schedule, and simply export the results for application(s) to consume, we can easily support that.

Please feel free to reach out to me directly at Dalton.Ruer@Qlik.Com if you have any questions, and I would be happy to help guide you once I fully understand your use case.

amarvilass
Contributor III
Author

Hi Daniele / Dalton,

Thanks for the response. I have checked further and realised that the POST method of the Qlik SDK can be used to upload a data file from AWS. We are using Python code to upload the data files, and I have also tried uploading the file using Postman. However, I am getting errors when specifying the file names.

Will you be able to suggest what the syntax should be when we specify the name? I have referred to the below links when trying the solutions.

https://qlik.dev/apis/rest/data-files#%23%2Fentries%2Fv1%2Fdata-files-post

https://community.qlik.com/t5/Integration-Extension-APIs/Data-files-POST-api-not-working/td-p/201482...

Also attached is a screenshot of the Postman error.

Thanks a lot for taking time to suggest on it.

Regards

Amar

Clever_Anjos
Employee

Since you already have an S3 bucket in your architecture, why not store the result there and let Qlik Cloud consume it?

Clever_Anjos
Employee

I was checking how qlik-cli performs an upload (and it works).

Example:
qlik data-file create --file .\users.csv --name users2.csv --verbose

it performs a two-step process

POST https://<tenant>/api/v1/temp-contents?filename=.%5Cusers.csv

This will return a temp id for the uploaded file

and then 

POST https://<tenant>/api/v1/data-files with this payload

PAYLOAD (MULTIPART):
------
> Content-Disposition: form-data; name="Json"
PART:
{
"name": "users2.csv",
"tempContentFileId": "643593c91f4c899289e76c59"
}
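
Translated into Python (since Amar mentioned using Python above), a minimal sketch of the same two-step flow; the tenant URL and API key are placeholders, and treating the Location header as the source of the temp id is an assumption, not something shown in the verbose output:

# Sketch of the two-step upload qlik-cli performs; placeholders throughout.
import json
import requests

TENANT = "https://your-tenant.eu.qlikcloud.com"   # placeholder tenant URL
HEADERS = {"Authorization": "Bearer <api-key>"}    # placeholder API key

# Step 1: push the raw bytes to temp-contents
with open("users.csv", "rb") as f:
    r1 = requests.post(f"{TENANT}/api/v1/temp-contents",
                       params={"filename": "users.csv"},
                       headers=HEADERS,
                       data=f)
r1.raise_for_status()
# Assumption: the temp content id is the last segment of the Location header
temp_id = r1.headers["Location"].rstrip("/").split("/")[-1]

# Step 2: create the data file that references the temporary content
r2 = requests.post(f"{TENANT}/api/v1/data-files",
                   headers=HEADERS,
                   files={"Json": (None,
                                   json.dumps({"name": "users2.csv",
                                               "tempContentFileId": temp_id}),
                                   "application/json")})
r2.raise_for_status()
print(r2.json())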

amarvilass
Contributor III
Author

Thanks @Clever_Anjos 

Our security team doesn't want any unencrypted data stored in S3, so we were not willing to store the output of the Python program there. We were, however, able to transfer the output of the Python program to Qlik Cloud using the Qlik SDK.

Now that the output is being transferred to Qlik Cloud, it is getting stored in the default location. Is there any way to specify the Qlik Cloud space when using the Qlik SDK to transfer the file?
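
For context, the data-files reference linked earlier also lists a connectionId field in the Json part, which appears to target a specific DataFiles connection (each space has its own). A rough sketch of what that would look like with plain requests, assuming the field behaves that way; all values are placeholders:

# Sketch only - assumes connectionId selects the target space's DataFiles connection.
import json
import requests

TENANT = "https://your-tenant.eu.qlikcloud.com"          # placeholder
HEADERS = {"Authorization": "Bearer <api-key>"}           # placeholder API key
CONNECTION_ID = "<DataFiles-connection-id-of-the-space>"  # placeholder

with open("output.csv", "rb") as f:
    r = requests.post(f"{TENANT}/api/v1/data-files",
                      headers=HEADERS,
                      files={
                          "Json": (None,
                                   json.dumps({"name": "output.csv",
                                               "connectionId": CONNECTION_ID}),
                                   "application/json"),
                          "File": ("output.csv", f, "text/csv"),
                      })
r.raise_for_status()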

Thanks in advance...

Regards

Amar

Clever_Anjos
Employee

jongore
Contributor

Yes, it is possible to integrate AWS SageMaker with Qlik Sense SaaS. One way to achieve this is by using Qlik's APIs to interact with Qlik Sense SaaS. Qlik Cloud exposes REST APIs such as the data-files, data-connections, and reloads APIs, as well as the Engine API, which can be used to upload files, create data connections, and reload apps on Qlik Sense SaaS.

You can write custom scripts or use existing connectors to connect AWS SageMaker to Qlik Sense SaaS via these APIs. Qlik Cloud also provides built-in connectors to various data sources, including Amazon S3, so you can either read the output generated by AWS SageMaker directly from S3 or use the data-files API to store the files in Qlik Sense SaaS.