Hi,
I have a subjob1 where I read a file and do a simple tLogRow.
I have linked this subjob1 to subjob2 through an OnSubjobOk trigger.
My tLogRow displays the correct values for the input file, but these values are never passed to subjob2.
I'm using Big Data Spark jobs. How do I pass the values from subjob1 to subjob2?
OnSubjobOk does not transfer data.
In your case you "read and forget".
You can connect tLogRow (Main) to the next component, or read the file again in the proper place.
@vapukov Thanks for the reply.
When I right-click on tLogRow, it doesn't give me any option to connect to the next job.
My Job1 is: tFileInputParquet -> tMap1 -> tLogRow (in the tMap I have a hashmap function).
The output of tMap1 needs to be passed to Job2.
You cannot pass a result from job to job, in Data Integration jobs or in Big Data jobs.
If you want to use the result of the first job in the next one, save the result to a file (or similar storage).
If you use a Main connection, it is always within the same job.
But if in your case tLogRow is only for logging, you can write a file output in parallel.
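Outside Talend, the pattern vapukov describes is simply persisting the intermediate result so the next job can read it back. A minimal sketch in plain Python (the file name, field names, and the two `job` functions are hypothetical stand-ins for the two subjobs, not Talend-generated code):

```python
import json
import os
import tempfile

# "Job 1": read input, transform, and persist the result for the next job,
# playing the role of a tFileOutputParquet at the end of subjob1.
def job1(rows, out_path):
    transformed = [{"id": r["id"], "value": r["value"].upper()} for r in rows]
    with open(out_path, "w") as f:
        json.dump(transformed, f)  # persisted handoff between jobs
    return transformed

# "Job 2": read the persisted file instead of expecting in-memory data,
# because nothing in memory survives across separate jobs.
def job2(in_path):
    with open(in_path) as f:
        return json.load(f)

path = os.path.join(tempfile.gettempdir(), "subjob1_result.json")
job1([{"id": 1, "value": "a"}], path)
print(job2(path))  # [{'id': 1, 'value': 'A'}]
```

The key point is that the handoff goes through storage, not through a trigger link: OnSubjobOk only orders execution.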
Thanks @vapukov, I got your point.
However, in my case the scenario is different. If I write the result out with tFileOutputParquet, then I have to join my subjob2 input to that parquet output, but I don't have any join key in common.
So I wanted to pass the values as an object by creating a tJava component between the two jobs, but that doesn't work.
I think you did not get my point:
What is the purpose of tLogRow in a real job? Do you really want to print 1-2-3-10M rows of text on screen (or into the TAC log files)?
Remove it and connect the output from tMap to the proper component.
Also, you can always add a key to the output file, for example a row number (sequence).
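The sequence-key idea above can be sketched outside Talend: when two outputs have no shared column but their rows are aligned by position, a generated row number serves as the join key. A minimal illustration in plain Python (the `left`/`right` data and field names are hypothetical):

```python
# Hypothetical data: two outputs with no common column, aligned by row order.
left = [{"name": "alice"}, {"name": "bob"}]
right = [{"score": 10}, {"score": 20}]

# Add a synthetic sequence key to each side (like a Numeric.sequence column).
keyed_left = [dict(row, seq=i) for i, row in enumerate(left)]
keyed_right = [dict(row, seq=i) for i, row in enumerate(right)]

# Join on the generated key, as a tMap with an inner join would.
index = {r["seq"]: r for r in keyed_right}
joined = [dict(l, **index[l["seq"]]) for l in keyed_left if l["seq"] in index]
```

This only works when both sides are produced in a stable, matching order; in a distributed Spark job that ordering has to be guaranteed before the row numbers are assigned.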