Anonymous

Using QlikSense variable in Spark SQL query (Simba connector)

Hello folks,

I need help using a Qlik Sense variable in a Spark SQL query.

I have made a connection using the Simba Spark connector and it is working fine, but I cannot figure out what I am doing wrong here.

Basically, I want to fetch data for every bucket_id in the Data table. I can loop and fetch them one by one, or I can pass a list using a WHERE ... IN clause, since Spark is querying Cassandra and I have constraints on the queries. The script is given below; it would be great if someone could help me with this.
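For the WHERE ... IN alternative mentioned above, one common Qlik load-script pattern is to build the quoted list with Concat() and read it into a variable with Peek(). This is a sketch under the assumption that bucket_id values are strings and that the Data table from the script below is already loaded; the table and variable names TempList and vList are illustrative:

```
// Build a single comma-separated, single-quoted list of all bucket_id values
// chr(39) is the single-quote character
TempList:
LOAD Concat(DISTINCT chr(39) & bucket_id & chr(39), ',') as vBucketList
RESIDENT Data;

LET vList = Peek('vBucketList');
DROP TABLE TempList;

// One query instead of a loop
SQL SELECT bucket_id
FROM SPARK.database1.x
WHERE bucket_id IN ($(vList));
```

Whether Spark-on-Cassandra accepts a large IN list depends on your setup, so the per-value loop may still be needed for very long lists.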

Data:
LOAD bucket_id
RESIDENT OD;

LIB CONNECT TO 'Simba Spark (qlik-sense_administrator)';

let noRows = NoOfRows('Data')-1;

for i=0 to $(noRows)

    let vVar=FieldValue('bucket_id',$(i));

    :
    LOAD *, $(vVar);
    SQL SELECT bucket_id
    FROM SPARK.database1.x
    where bucket_id='$(vVar)';

next i

With the code above, I am getting this error:

The following error occurred:

SQL##f - SqlState: S1000, ErrorCode: 35, ErrorMsg: [Simba][Hardy] (35) Error from server: error code: '0' error message: 'org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 764.0 failed 4 times, most recent failure: Lost task 0.3 in stage 764.0 (TID 94867, ip-172-31-30-155.ap-southeast-1.compute.internal): java.io.IOException: Exception during execution of SELECT "bucket_id" FROM "database1"."x" WHERE "bucket_id" = ? ALLOW FILTERING: Key may not be empty at com.datastax.spark.connector.rdd.Cas

The error occurred here:

SQL SELECT bucket_id FROM SPARK.database1.x where bucket_id=''
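The empty string in the failing statement points at the loop index: Qlik's FieldValue() is 1-based, so FieldValue('bucket_id', 0) returns NULL on the first pass and $(vVar) expands to nothing, which Cassandra rejects with "Key may not be empty". A minimal sketch of the corrected loop, keeping the rest of the original script; the table label X is hypothetical, since the original label is missing:

```
for i = 1 to NoOfRows('Data')   // FieldValue indices start at 1, not 0

    let vVar = FieldValue('bucket_id', $(i));

    X:                          // hypothetical label for the target table
    LOAD *;
    SQL SELECT bucket_id
    FROM SPARK.database1.x
    WHERE bucket_id = '$(vVar)';

next i
```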

1 Reply
Anil_Babu_Samineni

First try it without the WHERE condition, then see what happens.

Best, Anil. When applicable, please mark the correct/appropriate replies as "solution" (you can mark up to 3 "solutions"). Please LIKE threads if the provided solution is helpful.