Anonymous
Not applicable

Dynamically define context variable values from a flat file in a Big Data Spark job

Hi,

 

I am trying to update context variables at runtime from a flat file. It is a Big Data Spark job.

 

 

 

[screenshot of the job design]

 

I am reading a CSV file that contains the context variable names and their values. With a tJavaRow component, I am assigning each value to the corresponding context variable. The same approach works fine in a Standard job, but not in a Big Data job.
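
For reference, the tJavaRow code is along these lines (the name/value column names and the context variables shown here are simplified examples, not my real ones):

// tJavaRow (Standard job): the input schema has two String columns, name and value,
// read from the CSV. Each row assigns the matching context variable.
if ("input_path".equals(input_row.name)) {
    context.input_path = input_row.value;                      // String context variable
} else if ("batch_size".equals(input_row.name)) {
    context.batch_size = Integer.parseInt(input_row.value);    // Integer context variable
}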

 

Where am I going wrong with this approach? How can we assign values to context variables in a Big Data Spark job at runtime from a flat file?

 

Thanks.

 

 

1 Reply
Anonymous
Not applicable
Author

Hi,

For Big Data Batch jobs, the Talend recommendation is to use a DI (Standard) 'launcher' job that loads the "dynamic" context and then calls the Big Data job.

So far, there is also no tContextLoad component for Big Data Batch and Spark Streaming.
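
For example, the launcher can be a Standard (DI) job laid out as tFileInputDelimited (two-column schema: key, value) -> tContextLoad, followed on an OnSubjobOk trigger by a tRunJob that calls the Big Data Batch job with the "Transmit whole context" option enabled (the exact option label may differ slightly between versions). A sample flat file for that launcher, assuming a semicolon field separator and purely illustrative variable names:

input_path;/data/landing/orders.csv
batch_size;500

The Big Data Batch job only needs to declare the same context variables with default values; at run time it receives the values loaded by the launcher.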

Best regards

Sabrina