<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: AWS S3 Storage and DataBricks Data Connection in Connectivity &amp; Data Prep</title>
    <link>https://community.qlik.com/t5/Connectivity-Data-Prep/AWS-S3-Storage-and-DataBricks-Data-Connection/m-p/2004394#M11621</link>
    <description>&lt;P&gt;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/206291"&gt;@tamster&lt;/a&gt;, please feel free to create a support case for further investigation, and provide the tenant ID, timestamp, space ID, etc.&lt;/P&gt;</description>
    <pubDate>Tue, 15 Nov 2022 05:14:22 GMT</pubDate>
    <dc:creator>SankarReddyK</dc:creator>
    <dc:date>2022-11-15T05:14:22Z</dc:date>
    <item>
      <title>AWS S3 Storage and DataBricks Data Connection</title>
      <link>https://community.qlik.com/t5/Connectivity-Data-Prep/AWS-S3-Storage-and-DataBricks-Data-Connection/m-p/2002618#M11591</link>
      <description>&lt;P&gt;I have configured a Data Connection with DataBricks and AWS S3 Storage as per the screenshot below:&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="tamster_0-1668014249476.png" style="width: 400px;"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/93312i83C7F2C1AE3AC432/image-size/medium?v=v2&amp;amp;px=400" role="button" title="tamster_0-1668014249476.png" alt="tamster_0-1668014249476.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;I tested both connections and both were successful.&lt;/P&gt;
&lt;P&gt;When I execute "Prepare" as per the screenshot below:&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="tamster_1-1668014324586.png" style="width: 400px;"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/93313i2C69DDECDA4E9CF2/image-size/medium?v=v2&amp;amp;px=400" role="button" title="tamster_1-1668014324586.png" alt="tamster_1-1668014324586.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;I get the following error:&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Failed to prepare data app 'Onboarding_Landing' in project 'lsco_p2c' ('636b81a41b514a5bd2a3b2b2') - 'message: QDI-DW-STORAGE-ENDPOINT-SYNTAX-ERROR, SQL syntax error, s3://r42qlik/lsco_p2c/onboarding_landing/jc_p2c_final_7_days_all_time: getFileStatus on s3://r42qlik/lsco_p2c/onboarding_landing/jc_p2c_final_7_days_all_time: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden; request: HEAD &lt;A href="https://r42qlik.s3.eu-central-1.amazonaws.com" target="_blank" rel="noopener"&gt;https://r42qlik.s3.eu-central-1.amazonaws.com&lt;/A&gt; lsco_p2c/onboarding_landing/jc_p2c_final_7_days_all_time {} Hadoop 3.3.2, aws-sdk-java/1.12.189 Linux/5.4.0-1086-aws OpenJDK_64-Bit_Server_VM/25.345-b01 java/1.8.0_345 scala/2.12.14 vendor/Azul_Systems,_Inc. cfg/retry-mode/legacy com.amazonaws.services.s3.model.GetObjectMetadataRequest; Request ID: RF82360G030ECDF2, Extended Request ID: GCtFNnHcr/0FUP0uJ43MyMxviVTEllVwXxd5YKrzP1xRlBSB+5y/mEy9HlI4gGBcyBTHnR8ST2f1dOzQbs6Ydg==, Cloud Provider: AWS, Instance ID: i-034c0c5312d5de1c1 (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: RF82360G030ECDF2; S3 Extended Request ID: GCtFNnHcr/0FUP0uJ43MyMxviVTEllVwXxd5YKrzP1xRlBSB+5y/mEy9HlI4gGBcyBTHnR8ST2f1dOzQbs6Ydg==; Proxy: null), S3 Extended Request ID: GCtFNnHcr/0FUP0uJ43MyMxviVTEllVwXxd5YKrzP1xRlBSB+5y/mEy9HlI4gGBcyBTHnR8ST2f1dOzQbs6Ydg==:403 Forbidden.,traceId:1128fdb4d1bcc8b0a85c818b53fd5714'&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;The error suggests a permission issue with the AWS S3 bucket&amp;nbsp;&lt;SPAN&gt;r42qlik, so I made sure that it had the following bucket policy:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;{&lt;BR /&gt;"Version": "2012-10-17",&lt;BR /&gt;"Id": "Policy1668006475802",&lt;BR /&gt;"Statement": [&lt;BR /&gt;{&lt;BR /&gt;"Sid": "Stmt1668006452163",&lt;BR /&gt;"Effect": "Allow",&lt;BR /&gt;"Principal": {&lt;BR /&gt;"AWS": "arn:aws:iam::357942615838:user/tamster"&lt;BR /&gt;},&lt;BR /&gt;"Action": [&lt;BR /&gt;"s3:DeleteObject",&lt;BR /&gt;"s3:GetBucketLocation",&lt;BR /&gt;"s3:GetObject",&lt;BR /&gt;"s3:ListBucket",&lt;BR /&gt;"s3:PutObject"&lt;BR /&gt;],&lt;BR /&gt;"Resource": [&lt;BR /&gt;"arn:aws:s3:::r42qlik",&lt;BR /&gt;"arn:aws:s3:::r42qlik/*"&lt;BR /&gt;]&lt;BR /&gt;}&lt;BR /&gt;]&lt;BR /&gt;}&lt;/P&gt;
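&lt;P&gt;One way to reproduce the failing HEAD request outside of Qlik (a sketch, assuming the AWS CLI is configured with the same access key and secret key as the data connection) is:&lt;/P&gt;
&lt;PRE&gt;aws s3api head-object --bucket r42qlik --key lsco_p2c/onboarding_landing/jc_p2c_final_7_days_all_time&lt;/PRE&gt;
&lt;P&gt;A 403 here would point at the credentials or the policy rather than at Qlik; if the key does not exist the call returns 404 instead, which itself distinguishes a permissions problem from a missing object.&lt;/P&gt;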
&lt;P&gt;When configuring AWS S3 Storage I used the above principal's access key and secret key.&lt;/P&gt;
&lt;P&gt;Any help appreciated.&amp;nbsp;&lt;/P&gt;
</description>
      <pubDate>Wed, 09 Nov 2022 17:23:30 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Connectivity-Data-Prep/AWS-S3-Storage-and-DataBricks-Data-Connection/m-p/2002618#M11591</guid>
      <dc:creator>tamster</dc:creator>
      <dc:date>2022-11-09T17:23:30Z</dc:date>
    </item>
    <item>
      <title>Re: AWS S3 Storage and DataBricks Data Connection</title>
      <link>https://community.qlik.com/t5/Connectivity-Data-Prep/AWS-S3-Storage-and-DataBricks-Data-Connection/m-p/2004394#M11621</link>
      <description>&lt;P&gt;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/206291"&gt;@tamster&lt;/a&gt;, please feel free to create a support case for further investigation, and provide the tenant ID, timestamp, space ID, etc.&lt;/P&gt;</description>
      <pubDate>Tue, 15 Nov 2022 05:14:22 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Connectivity-Data-Prep/AWS-S3-Storage-and-DataBricks-Data-Connection/m-p/2004394#M11621</guid>
      <dc:creator>SankarReddyK</dc:creator>
      <dc:date>2022-11-15T05:14:22Z</dc:date>
    </item>
    <item>
      <title>Re: AWS S3 Storage and DataBricks Data Connection</title>
      <link>https://community.qlik.com/t5/Connectivity-Data-Prep/AWS-S3-Storage-and-DataBricks-Data-Connection/m-p/2005134#M11625</link>
      <description>&lt;P&gt;Unfortunately, as I'm on a trial I do not have the option to create a support case or add comments to one directly.&amp;nbsp; I did receive a response from support, case number&amp;nbsp;00059510, with the following comments:&lt;/P&gt;
&lt;P&gt;As per the documentation, when using Databricks on AWS with tables without a primary key, reloading the tables in the landing will fail in the Storage app. The create statement does not have any PK. &lt;BR /&gt;&lt;BR /&gt;To resolve this you can either: &lt;BR /&gt;- Define a primary key in the tables. &lt;BR /&gt;- Set spark.databricks.delta.alterTable.rename.enabledOnAWS to True in Databricks. &lt;BR /&gt;&lt;BR /&gt;Source: &lt;A href="https://help.qlik.com/en-US/cloud-services/Subsystems/Hub/Content/Sense_Hub/DataIntegration/TargetConnections/databricks-target.htm" target="_blank"&gt;https://help.qlik.com/en-US/cloud-services/Subsystems/Hub/Content/Sense_Hub/DataIntegration/TargetConnections/databricks-target.htm&lt;/A&gt; &lt;/P&gt;
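&lt;P&gt;For reference, the two workarounds might look like the following (a sketch; the table and column names are hypothetical, and primary key constraints in Databricks require Unity Catalog and NOT NULL key columns):&lt;/P&gt;
&lt;PRE&gt;-- Option 1: define a primary key on the landing table (hypothetical names)
ALTER TABLE lsco_p2c.onboarding_landing ALTER COLUMN id SET NOT NULL;
ALTER TABLE lsco_p2c.onboarding_landing ADD CONSTRAINT pk_onboarding PRIMARY KEY (id);

-- Option 2: set this in the cluster's Spark configuration
spark.databricks.delta.alterTable.rename.enabledOnAWS true&lt;/PRE&gt;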
&lt;P&gt;I have yet to try this.&amp;nbsp; However, I have since been able to upload data from DataBricks without setting up an AWS staging S3 bucket, so I'm wondering what the purpose of the S3 bucket is.&lt;/P&gt;</description>
      <pubDate>Wed, 16 Nov 2022 08:47:32 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Connectivity-Data-Prep/AWS-S3-Storage-and-DataBricks-Data-Connection/m-p/2005134#M11625</guid>
      <dc:creator>tamster</dc:creator>
      <dc:date>2022-11-16T08:47:32Z</dc:date>
    </item>
  </channel>
</rss>

