BeginnerNeedsHelpPlease

tFileList: how to keep the job running in spite of files which can't be read, and how to identify files which can't be read

Hello

I have many files in a folder and I need to read all of them.

Actually, I need to read a specific worksheet in each file.

The issue is that some files don't have the worksheet I have to read.

And there are too many files; I can't open and check every one of them.

I used a tFileList and specified the folder and the worksheet.

I run the job, and my issue is that when a file doesn't have the worksheet, the job stops; I then have to remove the file that made the job stop and restart the job.

It is a waste of time.

Is it possible to make the job run until the end, even if it reads some bad files?

I'd like to create a main job where I read all the files.

At the same time, I would have another job which is triggered on an error by the main job; this other job would store in a file all the filenames which don't have the worksheet and which triggered the error.

If you have another solution, I'm also interested.
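(As a rough idea for the "identify the bad files" part: a small standalone pre-check outside of Talend, sketched below, could list which files are missing the worksheet. This is only a sketch; it assumes Apache POI is on the classpath, and the folder path and sheet name are placeholders.)

    import java.io.File;
    import org.apache.poi.ss.usermodel.Workbook;
    import org.apache.poi.ss.usermodel.WorkbookFactory;

    public class SheetChecker {
        public static void main(String[] args) throws Exception {
            File folder = new File("C:/data/input");   // placeholder folder path
            String requiredSheet = "Sheet1";           // placeholder worksheet name

            File[] files = folder.listFiles((dir, name) -> name.toLowerCase().endsWith(".xlsx"));
            if (files == null) return; // folder not found

            // List every Excel file that is unreadable or missing the required sheet.
            for (File f : files) {
                try (Workbook wb = WorkbookFactory.create(f)) {
                    if (wb.getSheet(requiredSheet) == null) {
                        System.out.println("Missing worksheet: " + f.getName());
                    }
                } catch (Exception e) {
                    System.out.println("Unreadable file: " + f.getName() + " (" + e.getMessage() + ")");
                }
            }
        }
    }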

thank you

17 Replies
BeginnerNeedsHelpPlease
Author

Thank you, Prakhar, it almost works.

Now I have to create a child job to handle the error.

Is it something like that? I created a new job called A3 which will handle the error, and it is triggered on an error coming from the tFileList.

The A3 job is still empty so far; I will think about it later.

(screenshot of the new job design attached)

thank you for your help

 

Prakhar1
Creator III

Glad it worked.

Now you can write the error-handling logic in the new job or in the existing A2 job; it's up to you.

 

One more thing: the OnSubjobError trigger is linked the wrong way. It should be connected to tRunJob_1, and you should change it to OnComponentError.
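For the error-handling logic itself, a minimal tJava-style sketch could simply append the name of the file that failed to a log file, so the loop can move on to the next file. This assumes the standard tFileList global variable name ("tFileList_1_CURRENT_FILEPATH"; adjust the component number to your job) and a placeholder log path; if the tJava sits in the child job rather than the parent, the file path would first have to be passed down as a context variable.

    // Name of the file that just failed, as exposed by tFileList in the parent job.
    String failedFile = (String) globalMap.get("tFileList_1_CURRENT_FILEPATH");

    // Append the failing file name to a running log so the iteration can continue.
    try (java.io.FileWriter log = new java.io.FileWriter("C:/data/bad_files.txt", true)) { // placeholder path
        log.write(failedFile + System.lineSeparator());
    } catch (java.io.IOException ioe) {
        System.err.println("Could not log failed file: " + ioe.getMessage());
    }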

BeginnerNeedsHelpPlease
Author

Hello again,

I made the changes you asked for.

I removed the OnSubjobError and used OnComponentError in the same job, A2.

But it looks like tUnite doesn't work, because in my output file only the last file is written.

Do you know why?

(screenshot of the job attached)

gjeremy1617088143

Hi, have you checked the "Append existing file" checkbox on tFileOutputExcel_1?

Send me Love and Kudos

Prakhar1
Creator III

Yes, you need to check the "Append" option in the tFileOutputExcel.

If you are writing one file at a time, then there is no need to use tUnite; you can directly connect tFileInputExcel to tFileOutputExcel.

BeginnerNeedsHelpPlease
Author

You are both right.

I have to check the "Append" option.

 

I'm confused, because when I wasn't using the parent/child jobs (this was my first message), I only had one job with:

  • tFileList
  • tFileInputExcel
  • tUnite
  • tFileOutputExcel

 

I had never checked the "Append" option in the tFileOutputExcel.

When I made sure all my files were correct and the job ran properly, the output file contained data from all my input files.

And I didn't check the "Append" option. I think the tUnite component was enough to get all my data in one file.

 

Why doesn't it work with a parent/child job?

Prakhar1
Creator III
Creator III

1) So: you wanted to read all the files, and if some files do not have "Sheet1", you don't want the job to stop.

For that, the master/child job concept is used.

 

2) Now you want to read all the files and store them in a single file, so you have to select the "Append" option; otherwise, with each tRunJob run, the output file will get overwritten by each new file that has "Sheet1".

 

3) tUnite is helpful when you have multiple files and no exceptional cases among them; then you can attach the files with tUnite, which will basically make a UNION of all the files and write the output at once.

tUnite will not collect the data across separate job runs; you have to attach all the files at once to load them into a single output file.
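To make the difference concrete, here is a plain-Java sketch (not Talend-generated code, just an illustration; file names are placeholders) of the two patterns: writing one file per iteration, which needs append mode, versus collecting everything first and writing once, which is what tUnite does inside a single job.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    public class AppendVsUnite {
        // Pattern 1: parent/child style - each iteration handles one file,
        // so the output must be opened in APPEND mode or the previous rows are lost.
        static void writeOneFile(List<String> rowsFromOneFile) throws IOException {
            try (FileWriter out = new FileWriter("output.csv", true)) { // true = append
                for (String row : rowsFromOneFile) {
                    out.write(row + System.lineSeparator());
                }
            }
        }

        // Pattern 2: tUnite style - all files are read in the same flow,
        // their rows are unioned, then written once; no append needed.
        static void writeAllAtOnce(List<List<String>> rowsPerFile) throws IOException {
            List<String> union = new ArrayList<>();
            for (List<String> rows : rowsPerFile) {
                union.addAll(rows); // the "union" that tUnite performs
            }
            try (FileWriter out = new FileWriter("output.csv", false)) { // single write
                for (String row : union) {
                    out.write(row + System.lineSeparator());
                }
            }
        }
    }

In the parent/child design, each child run behaves like a single call to writeOneFile, which is why the output gets overwritten unless "Append" is checked.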

BeginnerNeedsHelpPlease
Author

thank you for your patience

I understand now