Inside Qlik Application Automation, the Amazon S3 functionality is split across two connectors: the native Cloud Storage connector and the dedicated Amazon S3 connector. To create, update, and delete files, it is highly recommended to use the native Cloud Storage connector. To retrieve information and metadata about regions and buckets, use the Amazon S3 connector.
The following is an example of an automation that uses the Amazon S3 connector to output a paginated list of regions and the buckets in each region (not covered in this article).
This article focuses on the available blocks in the native Cloud Storage connector in Qlik Application Automation to work with files stored in S3 buckets. It will provide some examples of basic operations such as listing files in a bucket, opening a file, reading from an existing file, creating a new file in a bucket, and writing lines to an existing file.
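The automation blocks themselves are configured without code, but as a point of reference, the following is a minimal Python sketch (using the boto3 library) of the underlying S3 operations these basic use cases map to. The bucket name, region, and object keys are placeholders, and this is not how the Cloud Storage blocks are implemented internally.

```python
# Minimal boto3 sketch of the S3 operations behind the basic use cases.
# Bucket name, region, and object keys are placeholders.
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")  # assumed region

# List files in a bucket
response = s3.list_objects_v2(Bucket="my-example-bucket", Prefix="reports/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Open and read an existing file
body = s3.get_object(Bucket="my-example-bucket", Key="reports/existing.csv")["Body"].read()
print(body.decode("utf-8"))

# Create a new file in a bucket
s3.put_object(
    Bucket="my-example-bucket",
    Key="reports/new-file.csv",
    Body="id,name\n1,example\n".encode("utf-8"),
)

# S3 objects are immutable, so "writing lines to an existing file" amounts to
# reading the current content, appending the new line, and re-uploading.
appended = body + b"2,another line\n"
s3.put_object(Bucket="my-example-bucket", Key="reports/existing.csv", Body=appended)
```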
The Cloud Storage connector supports additional building blocks to copy files, move files, and check if a file already exists in a bucket, which can help with additional use cases. The Amazon S3 connection also supports advanced use cases such as generating a URL that grants temporary access to an S3 object, or downloading a file from a public URL and uploading it to Amazon S3.
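For reference only, here is a hedged boto3 sketch of these additional operations (copying a file, checking whether a file exists, and generating a temporary-access URL). All bucket and key names are placeholders; none of this code is needed to use the automation blocks.

```python
# Sketch of the copy / exists-check / temporary URL operations with boto3.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3", region_name="eu-west-1")  # assumed region

# Copy a file within (or across) buckets
s3.copy_object(
    Bucket="my-example-bucket",
    Key="archive/report.csv",
    CopySource={"Bucket": "my-example-bucket", "Key": "reports/report.csv"},
)

# Check whether a file already exists in a bucket
def file_exists(bucket: str, key: str) -> bool:
    try:
        s3.head_object(Bucket=bucket, Key=key)
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "404":
            return False
        raise

# Generate a URL that grants temporary access to an S3 object (here: 1 hour)
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-example-bucket", "Key": "reports/report.csv"},
    ExpiresIn=3600,
)
print(url)
```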
Let’s get started.
Authentication for this connector is based on an access key ID and a secret access key.
Log in to the AWS console with an IAM user to generate the access key ID and secret access key required to authenticate.
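If you want to verify the generated credentials outside the automation, the following minimal Python sketch (boto3) shows how the same access key ID and secret access key authenticate against S3. The key values and region below are placeholders; the automation connection itself only asks for the two key values.

```python
# Sketch: authenticating to S3 with the IAM user's access keys (placeholders).
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",              # access key ID (placeholder)
    aws_secret_access_key="your-secret-key",  # secret access key (placeholder)
    region_name="eu-west-1",                  # assumed region
)
print(s3.list_buckets()["Buckets"])  # simple call to confirm the keys work
```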
Now let's go over the basic use cases and the supporting building blocks in the Cloud Storage connector to work with Amazon S3:
The Amazon S3 connector in Qlik Application Automation now supports setting the SSE header value when creating new files. This header is available on the Create File and Copy File blocks. You can keep the default behavior, which is AES256 encryption, or choose aws:kms encryption and provide a valid KMS Key ID.
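For reference, this boto3 sketch shows the two SSE options described above as they appear in a direct S3 upload; the bucket, keys, and KMS Key ID are placeholders.

```python
# Sketch of the two server-side encryption options (placeholder names).
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")

# Default behavior: AES256 (SSE-S3)
s3.put_object(
    Bucket="my-example-bucket",
    Key="encrypted/default.csv",
    Body=b"id,name\n",
    ServerSideEncryption="AES256",
)

# Alternative: aws:kms encryption with a valid KMS Key ID (placeholder)
s3.put_object(
    Bucket="my-example-bucket",
    Key="encrypted/kms.csv",
    Body=b"id,name\n",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="1234abcd-12ab-34cd-56ef-1234567890ab",
)
```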
Attached example file: create_and_write_files_amazon_s3.json
The information in this article is provided as-is and is to be used at your own discretion. Depending on the tool(s) used, customization(s), and/or other factors, ongoing support on the solution below may not be provided by Qlik Support.
Hello,
Thank you for the documentation.
Does the Amazon S3 connector in Qlik Application Automation support server-side encryption (SSE)? I am not able to pass the SSE header value in automations, but I can do it in the data connection configuration inside the app.
Hi,
Answering for consistency (since this was answered on the forum here):
Answer: This is indeed not possible in the Amazon S3 connector in automations.
If you require this functionality, I suggest you ask for it through ideation.
Hi,
Is it possible to read data from a QVD file using the read data from file block on Amazon S3?
Hello @OppilaalDaniel
Please post it in the Qlik Application Automation forum to give your question appropriate reach and attention.
All the best,
Sonja
Hi,
Is it possible to open a file (CSV or JSON), write a line to it, and then close it? In this scenario, I have a file that I want to update with a new line every time the job runs.
Thank you for your answer.
Best regards,
Bogdan