Krecik
Contributor II

How to log each job in specific table

Hello everybody,

 

I'm asking for your help.

 

I've created an audit table (PostgreSQL) and I want to populate it with a row for each of my jobs.

 

The fields of my audit table are copied from tLogCatcher's schema.

 

I've created 5 jobs, and each of them contains:

 

  • tWarn
    • Message : Job finished with status OK
    • Code : 0
    • Priority : Info
  • tDie :
    • Message : Job error
    • Error code : 1
    • Priority : Error
  • tDie :
    • Message : Cannot open tDBConnection (PostgreSQL)
    • Error code : 2
    • Priority : Error
  • tLogCatcher
    • Checkbox : Catch tDie
    • Checkbox : Catch tWarn
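To make the schema concrete, the rows this setup produces can be modeled as a plain Java class. The field names below follow tLogCatcher's default schema; the class itself and the okRow helper are only an illustration for this post, not Talend code:

```java
import java.util.Date;

// Illustrative model of one audit-table row; field names follow
// tLogCatcher's default schema (moment, pid, job, type, priority, ...).
public class AuditRow {
    public Date moment;      // timestamp of the caught event
    public String pid;       // process id of the job run
    public String job;       // job name
    public String type;      // "tWarn" or "tDie"
    public String priority;  // "Info" or "Error"
    public String message;
    public int code;         // 0 = OK, 1 = generic error, 2 = connection error

    // Build the row a tLogCatcher would emit for the "job finished" tWarn.
    public static AuditRow okRow(String jobName) {
        AuditRow r = new AuditRow();
        r.moment = new Date();
        r.job = jobName;
        r.type = "tWarn";
        r.priority = "Info";
        r.message = "Job finished with status OK";
        r.code = 0;
        return r;
    }

    public static void main(String[] args) {
        AuditRow r = okRow("J0001_CHARGEMENT_ODS");
        System.out.println(r.job + " -> " + r.type + " code " + r.code);
    }
}
```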

I've attached a screenshot of my job (see Job_Design.png).

 

I've created a folder "Traitement" which contains 2 master jobs (see MasterJob_ods_fact.png):

  • First master job : runs the 3 jobs from the ODS folder (J0001_CHARGEMENT_ODS)
  • Second master job : runs the 2 jobs from the FAITS folder (J0002_CHARGEMENT_FAITS)

Then I created a master job which runs J0001_CHARGEMENT_ODS followed by J0002_CHARGEMENT_FAITS (see MasterJOB.png).

 

When I start the master job, I see all my logs in the console. Now, how can I send all this data to my table? That's where I'm stuck.

 

Ideally, I would like to do the following:

For each sub-job:

 - insert a row into the audit table with the following information: PID, job name, number of rows inserted, start date, end date, run time, job status, error code, message

For my master job:

- insert a row into a dedicated table with the following information:
PID, job name, start date, end date, run time, status

 

If my master job fails, I can use its PID to find where the issue is in the audit table.

 

Is that possible? If so, could you help me with this? I would be grateful.

 

Regards,

Krecik

 

Technical information :

Database : PostgreSQL

Talend Open Studio for Data Integration (the free version)

Version : 7.2.1.20190614_0242

1 Reply
JohnRMK
Creator II

Hello,
There are several methods to capture logs and put them into tables.


Here's a simple approach you can use with TOS.


In your job:

tLogCatcher -> tMap_1 -> tMap_2 -> Table_1            (output 1: insert)
                              \
                               -> tMap_3 -> Table_1   (output 2: update)
                                    ^
                                    | (lookup)
                                  Table_1

In tMap_1, you are going to override some columns of the tLogCatcher flow with the global variables.

 

Moment, for example, will become the start date of execution. You can also add columns for the number of inserts or updates.
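As a rough sketch of what such an expression does (the component name tPostgresqlOutput_1 and the nbInserted helper are assumptions for this example; in the Studio you would type the cast directly in the tMap expression editor):

```java
import java.util.HashMap;
import java.util.Map;

public class TMapOneSketch {
    // Simulated Talend globalMap; the generated job code maintains one of these.
    static final Map<String, Object> globalMap = new HashMap<>();

    static {
        // Value Talend would have filled in at run time (key name assumed).
        globalMap.put("tPostgresqlOutput_1_NB_LINE_INSERTED", 1500);
    }

    // Null-safe version of the cast you would write in a tMap expression:
    // (Integer) globalMap.get("tPostgresqlOutput_1_NB_LINE_INSERTED")
    static int nbInserted(String componentName) {
        Object v = globalMap.get(componentName + "_NB_LINE_INSERTED");
        return v == null ? 0 : (Integer) v;
    }

    public static void main(String[] args) {
        System.out.println("rows inserted: " + nbInserted("tPostgresqlOutput_1"));
    }
}
```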

 

Important detail: tLogCatcher needs to fire twice in each job, once at the start of extraction and once at the end.

So in all your jobs, you will have:


tWarn (message + code, for example 100)
|
|
tDBInput -> tMap -> tDBOutput
|
|
tSetGlobalVar (context.nb_insert = (Integer) globalMap.get("tPostgresqlOutput_X_NB_LINE_INSERTED"))
|
|
tWarn (message + code 200)

 

Each | is an OnSubjobOk trigger.


Code 100 marks the start of the extraction and code 200 its end.
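A small sketch of this two-marker convention, assuming the run time is simply the difference between the moment timestamps caught for codes 100 and 200 (class and method names are illustrative):

```java
import java.util.Date;

public class JobMarkers {
    static final int CODE_START = 100; // tWarn fired before the extraction
    static final int CODE_END   = 200; // tWarn fired after the extraction

    // Run time in seconds between the two "moment" timestamps that
    // tLogCatcher captures for codes 100 and 200.
    static long runTimeSeconds(Date start, Date end) {
        return (end.getTime() - start.getTime()) / 1000L;
    }

    public static void main(String[] args) {
        Date start = new Date(0L);       // moment of the code-100 row
        Date end = new Date(150_000L);   // moment of the code-200 row
        System.out.println("run time: " + runTimeSeconds(start, end) + "s");
    }
}
```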

 

 

In tMap_2 you will have two outputs, with two filters on the rows:


A filter code == 100 ==> insert the job name and the execution start date (Table_1)


A filter code == 200 ==> update with the global variables (number of inserts, moment => end date, ...) (tMap_3)

 

 

In the second output of tMap_2, which goes to tMap_3, you will have Table_1 as a lookup, and you must perform a join on it to retrieve the information from the first insertion (execution ID, start date); then you update Table_1.
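The insert-then-update routing can be simulated in plain Java, with a map standing in for Table_1, keyed by PID the way the tMap_3 lookup join is (all names here are assumptions, not Talend code):

```java
import java.util.HashMap;
import java.util.Map;

// In-memory stand-in for Table_1; each row holds {job, start date, end date}.
public class AuditTableSketch {
    static final Map<String, String[]> table = new HashMap<>();

    // tMap_2 output 1 (code == 100): insert job name + start date.
    static void onStart(String pid, String job, String startDate) {
        table.put(pid, new String[] { job, startDate, null });
    }

    // tMap_2 output 2 -> tMap_3 (code == 200): look up the start row by PID
    // (the lookup join on Table_1) and update it with the end date;
    // run time, number of inserts, etc. would be updated the same way.
    static boolean onEnd(String pid, String endDate) {
        String[] row = table.get(pid);
        if (row == null) return false; // no matching start row
        row[2] = endDate;
        return true;
    }

    public static void main(String[] args) {
        onStart("12345", "J0001_CHARGEMENT_ODS", "2020-01-01 08:00:00");
        onEnd("12345", "2020-01-01 08:02:30");
        System.out.println(String.join(" | ", table.get("12345")));
    }
}
```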

 

There are different ways to do it. Leave me your email in a private message, and I will try to build an example job and share some screenshots with you.