
How to enable server-side encryption in S3 when using Spark Local?

TalendSolutionExpert
Contributor II


Last Update:

Feb 6, 2024 10:10:57 AM

Updated By:

Jamie_Gregory

Created date:

Apr 1, 2021 6:16:44 AM

Question

I want to use Spark Local with Amazon S3 and need server-side encryption enabled in S3 when creating the target file. When using EMR, I can set this property in the tHDFSConfiguration component using fs.s3n.server-side-encryption-algorithm, but when using Spark Local there is no tHDFSConfiguration. How can I set this property?

 

I've tried writing a local file and then using tS3Put, but this fails to compile; there appears to be a JAR conflict when a Spark Local Job and a standard Job using tS3* components are coupled together.

 

Answer

You can set Hadoop configuration properties in the Job's Spark Advanced Properties by adding the spark.hadoop. prefix. In this case, the property is spark.hadoop.fs.s3n.server-side-encryption-algorithm; Spark then injects it automatically into any Hadoop Configuration that the Spark Local Job creates.
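For reference, the following is a minimal sketch of the same mechanism outside Talend, showing how a spark.hadoop.-prefixed property reaches the S3 connector in a local Spark application. It assumes AES256 (SSE-S3) as the encryption algorithm, illustrative bucket and path names, and that the S3 filesystem connector and AWS credentials are already available on the classpath; the key/value pair mirrors what you would enter in the Spark Advanced Properties table.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkLocalS3SseExample {
    public static void main(String[] args) {
        // Any property prefixed with "spark.hadoop." is copied into the Hadoop
        // Configuration that Spark builds, which is how the S3 connector sees it.
        SparkSession spark = SparkSession.builder()
                .master("local[*]")
                .appName("s3-sse-example")
                // AES256 requests SSE-S3 server-side encryption (assumed value for this sketch).
                .config("spark.hadoop.fs.s3n.server-side-encryption-algorithm", "AES256")
                .getOrCreate();

        // Illustrative bucket and paths; objects written here are encrypted server-side by S3.
        Dataset<Row> df = spark.read().json("s3n://my-bucket/input/");
        df.write().parquet("s3n://my-bucket/encrypted-output/");

        spark.stop();
    }
}
```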
