sri_21
Contributor

Hive Audit capturing

Hi all,

Is it possible to capture the number of records inserted into Hive by a Big Data standard job, for auditing purposes? My job design is a standard job: tHiveInput -> tMap -> tHDFSOutput -> tHiveLoad.

A tFlowMeter component would be able to capture the inserted record count, but that means editing my jobs, and with more than 100 jobs that will be difficult. Is there a better approach to capture job start time, end time, number of records inserted, and number of records read into a Hive table in a Talend Big Data standard job?

I'm using Talend Data Fabric 6.3 Enterprise Edition.

Regards

SS

3 Replies
sri_21
Contributor
Author

Any updates on this scenario?
Anonymous
Not applicable

Turn on stats and logs at the project level. You need to be careful, though, about whether you use DB tables or files. Depending on whether you are doing pure ETL, Map/Reduce, or Spark, you will need to adopt different techniques.
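The DB-table variant can be sketched as one audit row per job run, written from a final component in the job. This is only an illustration: the `job_audit` table and its column names below are hypothetical, not Talend's own stats schema.

```java
import java.time.Instant;

// Minimal sketch of a per-run audit record; the job_audit table and
// its columns are illustrative assumptions, not Talend's built-in schema.
public class JobAudit {
    final String jobName;
    final Instant start;
    final Instant end;
    final long rowsRead;
    final long rowsInserted;

    JobAudit(String jobName, Instant start, Instant end,
             long rowsRead, long rowsInserted) {
        this.jobName = jobName;
        this.start = start;
        this.end = end;
        this.rowsRead = rowsRead;
        this.rowsInserted = rowsInserted;
    }

    // Parameterized INSERT that could be executed through a JDBC output
    // component or a tJava block at the end of the job.
    static String insertSql() {
        return "INSERT INTO job_audit (job_name, start_ts, end_ts, rows_read, rows_inserted) "
             + "VALUES (?, ?, ?, ?, ?)";
    }

    // Elapsed wall-clock time for the run, in milliseconds.
    long durationMillis() {
        return end.toEpochMilli() - start.toEpochMilli();
    }
}
```

With one shared audit table, all 100+ jobs can write to the same place without per-job schema changes.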

sri_21
Contributor
Author

Hi,

As I'm doing a pure ETL process, rather than changing settings at the project level I need to capture this information at the job level.

In a standard Big Data job I'm not finding an option to capture the number of records inserted by the tHiveLoad component.

And I'm able to capture the job end time with tLogCatcher and tStatCatcher, but I need to capture the job start time too.

Thanks,

SS
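One common workaround for the missing start time, sketched below under assumptions: place a tJava component first in the job that stashes the start timestamp in `globalMap`, then read it back in the final component that writes the audit row. The key name `JOB_START_MS` is arbitrary, and `globalMap` is simulated with a plain `HashMap` so the snippet runs outside Talend.

```java
import java.util.HashMap;
import java.util.Map;

public class StartTimeDemo {
    public static void main(String[] args) throws InterruptedException {
        // In a real job this is Talend's shared globalMap; a plain map stands in here.
        Map<String, Object> globalMap = new HashMap<>();

        // tJava placed first in the job: record the start timestamp once.
        globalMap.put("JOB_START_MS", System.currentTimeMillis());

        Thread.sleep(50); // stands in for the actual ETL work

        // Final tJava / audit component: compute elapsed time from the stored start.
        long startMs = (Long) globalMap.get("JOB_START_MS");
        long elapsedMs = System.currentTimeMillis() - startMs;
        System.out.println("job took ~" + elapsedMs + " ms");
    }
}
```

The stored start time and the end time captured from tStatCatcher together give both timestamps for the audit row.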