Hi All,
I have a simple requirement: if a query (tSalesforceInput) doesn't return any value, follow step 1, and if it returns a value, follow step 2. Both step 1 and step 2 create a record using tSalesforceOutputBulkExec.
I tried using an expression filter in tMap_2, where if Id == null the flow takes step 2, and if Id != null it takes step 1, but this doesn't seem to work.
Hello,
The best way to do this is to run a simple probe query like "SELECT Id FROM SF_OBJECT LIMIT 1",
send its output to a tHashOutput, and then branch with Run If triggers on the component's row count:
tHashOutput --> If trigger (NB_LINE > 0) --> tSalesforceInput --> tSalesforceOutput
tHashOutput --> If trigger (NB_LINE == 0) --> tSalesforceInput --> tSalesforceOutput
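To make the two If-trigger conditions concrete, here is a minimal sketch of the Java expressions they would hold. This assumes the hash component is named tHashOutput_1 (adjust the key to match your component's actual name); Talend publishes each component's row count into globalMap after its sub-job finishes:

```java
import java.util.HashMap;
import java.util.Map;

public class IfTriggerDemo {
    public static void main(String[] args) {
        // Simulated globalMap, as Talend fills it after the probe sub-job runs.
        // "tHashOutput_1" is an assumed component name; match yours.
        Map<String, Object> globalMap = new HashMap<>();
        globalMap.put("tHashOutput_1_NB_LINE", 1); // the probe query returned 1 row

        // Expression for the "records exist" If trigger:
        boolean recordsExist = ((Integer) globalMap.get("tHashOutput_1_NB_LINE")) > 0;

        // Expression for the "no records" If trigger:
        boolean noRecords = ((Integer) globalMap.get("tHashOutput_1_NB_LINE")) == 0;

        System.out.println(recordsExist ? "run step 1" : "run step 2");
    }
}
```

In the job itself you would paste only the boolean expression (e.g. `((Integer)globalMap.get("tHashOutput_1_NB_LINE")) > 0`) into each If trigger's Condition field.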
Thanks @JohnRMK
I think I am stuck. When I tried it as below, I have a tSalesforceInput that returns an Id, which I believe is stored in tHashOutput. However, if the query returns no records, tHashOutput will not hold any value. When tSalesforceInput returns no records, I want the job to take a different path.
Hello,
Here is a sample design for your case.
If the query returns a record, the green sub-job executes; if it doesn't return a record, the red sub-job executes.
Hello @JohnRMK, thanks a lot for taking the time to prepare the sample. It has saved a lot of time!
When there are no records, the flow comes via ORDER.2, which is correct. However, I don't need to run tSalesforceInput_3, because the input for tSalesforceOutputBulkExec_1 comes from various global variables. So how do I pass that data to Create_1? When I tried the above flow, in tMap_6 I provided the global variable references directly to tSalesforceOutputBulkExec.
My tMap_6 between tSalesforceInput_3 and tSalesforceOutputBulkExec_1 looks like this... not sure if that's the correct way.
Hey,
You can use tFixedFlowInput to send a single record carrying all the global variables, like this.
But it's one record per iteration, and I advise you to use the plain tSalesforceOutput instead of the bulk component, which is designed to handle millions of records.
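To illustrate the tFixedFlowInput approach: in its "Use Single Table" mode, each column's Value field holds a Java expression that reads from globalMap. The sketch below simulates that; the keys "accountName" and "accountId" are assumptions for illustration (use whatever keys your job actually sets, e.g. via tSetGlobalVar or tJava):

```java
import java.util.HashMap;
import java.util.Map;

public class FixedFlowDemo {
    public static void main(String[] args) {
        // Simulated globalMap; these keys are hypothetical examples.
        Map<String, Object> globalMap = new HashMap<>();
        globalMap.put("accountName", "Acme Corp");
        globalMap.put("accountId", "0015g00000XXXXX");

        // Each tFixedFlowInput column Value would be a cast expression
        // like these, producing a single row for the output component:
        String name = (String) globalMap.get("accountName");
        String id   = (String) globalMap.get("accountId");

        System.out.println(name + " / " + id); // prints "Acme Corp / 0015g00000XXXXX"
    }
}
```

That single row can then flow straight into tSalesforceOutput, with no tSalesforceInput_3 needed on the "no records" branch.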