Anonymous
Not applicable

Big Data Spark Job - tCacheIn / tCacheOut throws null pointer exception

Hi,

I am working on a Big Data Spark requirement where I have to use tCacheOut and tCacheIn. The attached job screenshot works fine, but in one scenario, when tCacheOut has nothing to store (i.e. the filter does not let any rows flow to the next component), the job throws a null pointer exception.

I know there are alternatives, such as writing the output to disk and reading it again in the next step, but I want to avoid that because disk reads and writes add overhead.
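To make that alternative concrete, here is a rough sketch of the disk round trip in plain Spark (Java), outside Talend; the dataset, filter, and path are made up for illustration and are not from my actual job:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DiskHandoffSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("disk-handoff-sketch")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical stand-in for the filtered flow; in the failing
        // scenario the filter matches nothing, so the dataset is empty.
        Dataset<Row> filtered = spark.range(10).toDF("id").filter("id > 100");

        // Write the intermediate result to disk and read it back in the
        // next step. Parquet keeps the schema even for an empty result
        // (in recent Spark versions), so the empty case reads back cleanly,
        // at the cost of the disk round trip I want to avoid.
        String path = "/tmp/intermediate_handoff"; // hypothetical path
        filtered.write().mode("overwrite").parquet(path);
        Dataset<Row> reread = spark.read().parquet(path);
        reread.printSchema();

        spark.stop();
    }
}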

 

How can we handle the null pointer exception in this case?
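For reference, this is the kind of empty-cache guard I am hoping for, sketched in plain Spark (Java) rather than the code Talend generates. As I understand it, tCacheOut roughly corresponds to persisting the dataset and tCacheIn to reading it back; everything here (data, filter, names) is hypothetical:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.storage.StorageLevel;

public class CacheGuardSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("cache-guard-sketch")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical stand-in for the filtered flow feeding tCacheOut;
        // with a filter that matches nothing, the dataset is empty.
        Dataset<Row> filtered = spark.range(10).toDF("id").filter("id > 100");

        // Rough equivalent of tCacheOut: persist the dataset in memory
        // so the tCacheIn side can reuse it without recomputation.
        filtered.persist(StorageLevel.MEMORY_ONLY());

        // An empty dataset is valid in Spark, so the consuming side can
        // branch on isEmpty() instead of assuming at least one row exists.
        if (filtered.isEmpty()) {
            System.out.println("Cache is empty; skipping downstream processing.");
        } else {
            filtered.show();
        }

        filtered.unpersist();
        spark.stop();
    }
}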

1 Reply
Anonymous
Not applicable
Author

Hello
Have you defined the schema? I am not able to reproduce the error. Does it occur randomly, or do you always get it?

Regards
Shong