You can observe your Data Integration Jobs running on Talend Remote Engines, provided the Jobs are scheduled to run on Talend Remote Engine version 2.9.2 or later.
This is a step-by-step guide on how Talend Cloud Management Console can provide the data needed to build your own customized dashboards, with an example of how to ingest and consume the data in Microsoft Azure Monitor.
Once you have set up the metric and log collection system on Talend Remote Engine and in your Application Performance Monitoring (APM) tool, you can design and organize your dashboards using the information that Talend Cloud Management Console sends to the APM tool through the engine.
This document has been tested on the following products and versions running in a Talend Cloud environment:
Optional requirements for obtaining detailed Job statistics:
To configure the required files and check that the Remote Engine is running, see the Monitoring Job runs on Remote Engines section of the Talend Remote Engine User Guide for Linux.
Use any REST client, such as Talend API Tester or Postman, to call the endpoint as explained below.
GET http://ip_where_RE_is_installed:8043/metrics/json
8043 is the default HTTP port of Remote Engines. Replace it with the port you used when installing the Remote Engine.
For example:
GET http://localhost:8043/metrics/json
Authorization: Bearer F7VvcRAC6T7aArU
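The same call can also be scripted. The following is a minimal sketch in Python (using the third-party requests library), assuming the Remote Engine runs locally on the default port and that TOKEN holds the Bearer token configured for your engine:

import json

import requests  # third-party: pip install requests

# Assumption: the Remote Engine is reachable on localhost:8043 (the default
# HTTP port) and TOKEN holds the Bearer token used for the metrics endpoint.
METRICS_URL = "http://localhost:8043/metrics/json"
TOKEN = "F7VvcRAC6T7aArU"  # replace with your own token

response = requests.get(
    METRICS_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print(json.dumps(response.json(), indent=2))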
There are numerous ways to push the metric results to any analytics and visualization tool. This document shows how to use the Azure Monitor HTTP Data Collector API to push the metrics to an Azure Log Analytics workspace. Python code is used to send the logs in batch mode at regular intervals. Alternatively, you can create a Talend Job as a service for real-time metric extraction. For more information, see the attached Job and Python Code.zip file.
The logs are pushed to the Azure Log Analytics workspace as “custom logs”.
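The attached Python code is not reproduced here, but the core of the Azure Monitor HTTP Data Collector API call it relies on looks like the following minimal sketch. WORKSPACE_ID, SHARED_KEY, and the post_metrics helper are placeholder names: the workspace ID and key come from your Log Analytics workspace, and the Log-Type value Remote_Engine_OBS becomes the Remote_Engine_OBS_CL table used in the queries below once Azure appends the _CL suffix for custom logs.

import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests  # third-party: pip install requests

# Placeholders: take the workspace ID and primary/secondary key from your
# Log Analytics workspace. Azure appends _CL to LOG_TYPE for custom logs.
WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = "<primary-or-secondary-key>"
LOG_TYPE = "Remote_Engine_OBS"

def build_signature(rfc1123_date, content_length):
    # Signature string required by the HTTP Data Collector API:
    # POST\n<length>\napplication/json\nx-ms-date:<date>\n/api/logs
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{rfc1123_date}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(SHARED_KEY),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_metrics(records):
    # records is a list of JSON-serializable metric entries
    body = json.dumps(records)
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
        "Authorization": build_signature(rfc1123_date, len(body)),
    }
    url = (
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01"
    )
    response = requests.post(url, data=body, headers=headers, timeout=30)
    response.raise_for_status()

The Data Collector API accepts batches of JSON records, so the script can accumulate several metric snapshots from the Remote Engine endpoint and post them at whatever interval you choose.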
Talend Cloud Management Console provides metrics through Talend Remote Engine. They can be integrated into your APM tool to observe your Jobs.
For the list of available metrics, see Available metrics for monitoring in the Talend Remote Engine User Guide for Linux.
Query:
Remote_Engine_OBS_CL
| where TimeGenerated > ago(2d)
| where name_s == 'component_connection_rows_total'
| summarize sum(value_d) by context_target_connector_type_s
| render piechart
Chart:
Query:
Remote_Engine_OBS_CL
| where TimeGenerated > ago(2d)
| where name_s == 'component_execution_duration_seconds'
| summarize count(), avg(value_d) by context_artifact_name_s, context_connector_label_s
Chart:
Query:
Remote_Engine_OBS_CL
| where name_s == 'os_memory_bytes_available' or name_s == 'os_filestore_bytes_available'
| summarize sum(value_d)/1000000 by name_s
Chart:
Query:
Remote_Engine_OBS_CL
| where TimeGenerated > ago(2d)
| where name_s == 'jvm_process_cpu_load'
| summarize events_count=sum(value_d) by bin(TimeGenerated, 15m), context_artifact_name_s
| render timechart
Chart:
This section explains the sample Job used to send the metric logs to the Azure Log Analytics workspace. The Job is available in the attached Job and Python Code.zip file.
The components used and their detailed configurations are explained below.
tREST
Component used to make the REST API GET call.
tJavaRow
Component used to print the response from the API call.
tFileOutputRaw
Component used to create a JSON file with the API response body.
tSystem
Component used to call the Python code.
tJava