Darmesh
Contributor III

Loading problem in Talend

 

I have 233 records and I am loading them into Excel.

[screenshot: job design]


But it loads only 43 records and then gets stuck.

If I remove the OnSubjobOk and run the subjob separately, it runs fine.

7 Replies
Jesperrekuh
Specialist

This design looks horrible, hahaha. I suggest creating subjobs... Besides that, there's an OnComponentOk in your design: dummy -> OnComponentOk -> tFileExcelWorkbookOpen -> ...
Darmesh
Contributor III
Author

@Dijke @manodwhb @TRF

 

I still don't understand. Can you explain it to me?

How do I load all the data?

Jesperrekuh
Specialist

@Darmesh,
You load data into Excel... but after 43 records it's stuck... do you get an error?
There's a checkbox in the Excel component to stop reading on empty rows, is it set?
Apart from the checkbox: it works without the OnSubjobOk, so clearly that's not the problem?


DB Table -> 233 records -> Excel1
OnSubjobOk -> Excel1 -> only 43 records -> processing data -> ExcelFinal
Your picture of your job flow is kind of hard to understand.

[attached screenshot: a.PNG]
Darmesh
Contributor III
Author

Consider this flow:

1st: from the table to Excel

2nd: from the Excel sheet, loading into the default template

These 2 processes happen in my flow.

So the problem is: if the 1st and 2nd processes are put together in a single job OR connected as 2 subjobs, it does not work; it gets stuck at 45 records.

If I run the second flow alone, separately, it runs fine.

Jesperrekuh
Specialist

Why not create multiple outputs in the tMap:
tMap -> out1 (order1) -> MySQLOut
     -> out2 (order2) -> ExcelOutFile -> continue the row -> tReplace -> tMap_40 -> tFlowToIterate -> ExcelOutFinal

This is how you should do it.

Darmesh
Contributor III
Author

Here is what I am doing:

[screenshots: job design, report input file, replacing negative values with (), creating multiple excel sheets, output template]

Jesperrekuh
Specialist
Specialist

No idea, buddy, what's going on in your job.
Are you sure the file is closed before you open context.report_excel (the file generated in the previous DB load)? So this is the file with 250+ records, and it only reads 43?
Or
is it only able to write 43 records to tFileExcelWorkbookOpen_5?
Or
are you sure it reads the correct file in context.report_excel? I suggest generating the files using a global variable or something like "<datetime>_out_<processid>.xls".
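
A minimal sketch of such a file-name expression, typed into the File Name field of the Excel output component (it's a Java expression; TalendDate is Talend's built-in date routine and pid is the job's process-id variable, while context.output_dir is a hypothetical context variable used only for illustration):

// sketch only: context.output_dir is a hypothetical context variable, adjust to your project
context.output_dir + "/" + TalendDate.getDate("CCYYMMDDhhmmss") + "_out_" + pid + ".xls"

That way every run writes its own file, and the second subjob cannot accidentally pick up a stale copy from an earlier run.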

You could place a tLogRow directly after the Excel read of context.report_excel and make sure all records come through before the tMaps, tReplace, etc.?
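
If you want to sanity-check the intermediate file outside Talend as well, a minimal standalone row count with Apache POI (which, as far as I know, the Excel components build on) could look like the sketch below; the path is a placeholder for whatever context.report_excel points to:

import java.io.File;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public class RowCountCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder path: point this at the file context.report_excel resolves to
        File report = new File("report_excel.xls");
        try (Workbook wb = WorkbookFactory.create(report)) {
            Sheet sheet = wb.getSheetAt(0);
            // getLastRowNum() is zero-based, so add 1 for the number of rows
            System.out.println("Rows in first sheet: " + (sheet.getLastRowNum() + 1));
        }
    }
}

If that prints 233, the write side is fine and the problem is in the read; if it prints 43, the first subjob never finished writing the file.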

Why not create a subjob that runs after the previous steps are finished? I'm not a big fan of creating big flows... break it down; using OnSubjobOk as the trigger between subjobs is a very nice indicator, also for job logging and for restarting a part of a job chain.
Because you don't need to query the DB to solve this file issue; you are just creating the same file over and over again...