Hi,
I am working on a big data Spark requirement where I have to use tCacheOutput and tCacheInput. The attached job screenshot works fine, but in one scenario, when tCacheOutput has nothing to store (i.e. the filter does not let any rows flow to the next component), the job throws a NullPointerException.
I know there are alternatives, such as writing the output to disk and reading it back in the next step, but I don't want to do that because disk reads and writes are always an overhead.
How can we handle the NullPointerException in this case?
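For context, here is a minimal plain-Spark sketch (Java, with hypothetical names and paths, not Talend-generated code) of the kind of emptiness guard I have in mind: check whether the filtered dataset produced any rows before relying on the cached branch, so the empty case is handled explicitly instead of failing downstream.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class EmptyCacheGuard {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("EmptyCacheGuard")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical input and filter, standing in for the job's source and filter step.
        Dataset<Row> input = spark.read().option("header", "true").csv("input.csv");
        Dataset<Row> filtered = input.filter("amount > 100");

        // Cheap emptiness check: take at most one row instead of counting everything.
        if (filtered.takeAsList(1).isEmpty()) {
            // Nothing passed the filter; skip the cached branch instead of hitting a NPE.
            System.out.println("Filter produced no rows; skipping the cached branch.");
        } else {
            filtered.cache();                // stands in for tCacheOutput
            Dataset<Row> reused = filtered;  // stands in for tCacheInput
            reused.show();
        }

        spark.stop();
    }
}
```

Is there a way to express this kind of check around tCacheOutput/tCacheInput, or another supported way to make the cache components tolerate an empty flow?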