Anonymous
Not applicable

REG: JSON file to Postgres table

Hi Team,

 

I have a JSON file as the source, which needs to be loaded into a Postgres table.

 

Json file :

 

[{"DateTime":"2016-09-03 14:00:00","Total_Crude_Flow_Rate_MBperDay":171.9602,"Actual_Yield_LGO_MBperDay":39.2051,"Actual_Yield_MVGO_MBperDay":22.5081,"Actual_Yield_HVGO_MBperDay":26.0575,"Actual_Yield_VGO_MBperDay":48.5655,"Actual_Yield_VR_MBperDay":22.8243,"Crude_Ambient_Temperature_DEGC":43.5956,"Desalter_Inlet_Temperature_1_DEGC":141.2322,"Desalter_Inlet_Temperature_2_DEGC":133.4597,"Resultant_Desalter_Inlet_Temperature_DEGC":138.3071,"Desalter_Outlet_Temperature_DEGC":141.5549,"Coil_Inlet_Temperature_DEGC":262.1993,"RunDate":"2019-07-26 12:51:07","SavedDate":"2019-07-26 17:49:21"}]

 

Can someone tell me how to achieve this? What should the loop JSON query be?

 

Thanks,

Prabuj

 

 

 

 

6 Replies
Anonymous
Not applicable
Author

Hi Prabuj,

 

The easiest way to handle this is to create the JSON metadata schema in the Metadata area of the repository. There you can specify the loop and preview the sample results. Right now you have not mentioned the desired output for this record. If you are still stuck, please provide 5 input records and the expected output for those records, and we can take it forward from there.
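For a record like the sample above, the loop is simply the top-level JSON array: each array element becomes one output row, and each column maps to its key (DateTime, Total_Crude_Flow_Rate_MBperDay, and so on). In tFileInputJSON or the repository wizard this is typically written as a JSONPath loop such as $[*] when reading by JsonPath; please verify the exact expression for your Talend version. Purely as an illustration outside Talend, a minimal Java sketch of that loop (using Jackson) looks like this:

```
// Minimal plain-Java sketch (Jackson), outside Talend, just to show what the
// "loop" iterates over: the top-level JSON array, one record per element.
// The file name is a placeholder; the field names come from the sample above.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;

public class JsonLoopDemo {
    public static void main(String[] args) throws Exception {
        JsonNode root = new ObjectMapper().readTree(new File("sample.json"));
        for (JsonNode rec : root) {   // loop over the top-level array
            String dateTime = rec.get("DateTime").asText();
            double totalCrude = rec.get("Total_Crude_Flow_Rate_MBperDay").asDouble();
            System.out.println(dateTime + " -> " + totalCrude);
        }
    }
}
```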

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved

Anonymous
Not applicable
Author

@nthampi Here is the explanation:

 

I have JSON files arriving in an S3 bucket folder every 10 minutes. Each JSON file has the same column structure, with just one record each holding different values. I need to load these into a single common table in Postgres. How can I achieve this? I also have TAC; how do I schedule the job and get it done?

 

JSON file, for example:

 

[{"DateTime":"2016-09-03 14:00:00","Total_Crude_Flow_Rate_MBperDay":171.9602,"Actual_Yield_LGO_MBperDay":39.2051,"Actual_Yield_MVGO_MBperDay":22.5081,"Actual_Yield_HVGO_MBperDay":26.0575,"Actual_Yield_VGO_MBperDay":48.5655,"Actual_Yield_VR_MBperDay":22.8243,"Crude_Ambient_Temperature_DEGC":43.5956,"Desalter_Inlet_Temperature_1_DEGC":141.2322,"Desalter_Inlet_Temperature_2_DEGC":133.4597,"Resultant_Desalter_Inlet_Temperature_DEGC":138.3071,"Desalter_Outlet_Temperature_DEGC":141.5549,"Coil_Inlet_Temperature_DEGC":262.1993,"RunDate":"2019-07-26 12:51:07","SavedDate":"2019-07-26 17:49:21"}]

 

Target table structure (one-to-one mapping), Postgres database; a plain-JDBC sketch of the mapping follows the column list:

 

"Id",
"Datetime",
"TotalCrudeFlowRate",
"ActualYieldLGO",
"ActualYieldMVGO",
"ActualYieldHVGO",
"ActualYieldVGO",
"ActualYieldVR",
"CoilInletTemperature",
"DesalterInletTemperature1",
"DesalterInletTemperature2",
"DesalterOutletTemperature"

 

 

0683p000009M7Ad.jpg

 

Thanks,

Prabuj

Anonymous
Not applicable
Author

Hi @prabuj27 ,

 

At a high level, I would do something like the following.

0683p000009M734.png

 

I have created only a skeleton flow, but if you get stuck somewhere, we can look at it at that point.

 

The first step is to identify the files that are candidates for loading. For this, you first need to pass the date and time from which files should be read as a parameter to the job. You can also store the last processed date in a DB table and read it from there. Store this date in a context variable.
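For example, a small tJava snippet along these lines can hold the cut-off in a context variable. Both context variable names here are hypothetical ones you would define in the job yourself: lastProcessedDateString (String) is the value passed in as a job parameter or read from the tracking table, and lastProcessedDate (Date) is the parsed cut-off.

```
// Sketch for a tJava component inside the Talend job (context and the Talend
// routines are available there): convert the incoming cut-off String into a Date.
context.lastProcessedDate = TalendDate.parseDate(
        "yyyy-MM-dd HH:mm:ss", context.lastProcessedDateString);
```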

 

Now, read the S3 file names one at a time. Parse the date part of the file name using a substring function and convert the value to a date using the Talend date functions (you can use the same date functions in a tMap for the string-to-date conversion). In the Run if condition, use a date comparison to verify that the file date is later than the cut-off date stored in the context variable. This ensures that only files after the cut-off date are processed. Use a tS3Get to fetch all the files that pass the date check.
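A sketch of such a Run if condition is below. The global variable name and the substring positions are assumptions based on a key like data_201609031400.json, so check the tS3List component's Outline view for the exact variable it exposes and adjust the pattern to your file names.

```
// "Run if" expression sketch: keep only files whose date, parsed from the S3 key,
// is later than the cut-off stored in the (hypothetical) context variable.
TalendDate.parseDate("yyyyMMddHHmm",
        ((String) globalMap.get("tS3List_1_CURRENT_KEY")).substring(5, 17))
    .after(context.lastProcessedDate)
```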

 

In the last subjob, you will have all the required files in a local directory. You can use a tFileList to read the files one at a time and pass each one to the JSON input component via the file path variable.
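In the JSON input component's file name field, you would reference the file the tFileList iteration is currently on, for example:

```
// Standard tFileList global variable; "tFileList_1" assumes the component keeps
// its default name.
((String) globalMap.get("tFileList_1_CURRENT_FILEPATH"))
```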

0683p000009M7Ai.png

 

In this way, the data from all the files will be loaded into the target DB.

 

Please make sure you delete the files from the temporary local directory after processing. If you use a DB table to store the cut-off date, update that value as well so that the same files are not picked up again from S3 in the next iteration.
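A sketch of that cleanup, assuming a tJava at the end of each iteration (a tFileDelete component would do the same job); the cut-off date update itself would be a separate UPDATE against the tracking table, for example through a tDBRow/tPostgresqlRow:

```
// Remove the local copy once it has been loaded. Uses the same tFileList
// global variable as above ("tFileList_1" assumes the default component name).
String processedFile = (String) globalMap.get("tFileList_1_CURRENT_FILEPATH");
new java.io.File(processedFile).delete();
```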

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved

Anonymous
Not applicable
Author

Hello good people

I have a problem in my first experience working with Talend Open Studio for Big Data. I'm consuming Kafka messages (JSON strings) into Talend and eventually need to output them into a Postgres data warehouse. My problem is that the Postgres table is not created; if I create it manually, no data is inserted into the table, although the job does acknowledge that the table already exists. I've been stuck for days and I hope you can help me out. If you have any better solution for loading this data into the data warehouse (Postgres), I will appreciate it. Below is a brief description of my setup.

 

Here is the JSON I'm receiving from the Kafka topic, before any extraction:

0683p000009MZsb.png

 

I then put in place a pipeline as follows:

0683p000009Ma7V.png

 

I'm selecting just 2 columns from the JSON for the Postgres table; here is the output in tLogRow:

0683p000009MaEu.png

 

Here is my configuration for tKafkaInput:

0683p000009MZgK.jpg

 

and the tExtractJSONFields configuration is as follows:

0683p000009MaBx.png

 

I hope you can help me with ideas on how to solve this.

 

Regards

manodwhb
Creator III

Could you please check whether you have committed?
Anonymous
Not applicable
Author

Hello Manohar

I'm sorry, I didn't understand.

Ngoni