Anonymous
Not applicable

Generating too many tokens for a single query when using tSalesforceInput

Hello, 

 

I have the following problem.

I need to get the Id of the SocialPersona object from Salesforce, filtered by some conditions. I cannot select the whole SocialPersona object as it is too large, so I send smaller queries in a loop. I use tSalesforceInput in Query mode (Bulk mode is not an option because there is a limit of 10,000 connections to Salesforce per 24h, which would be exceeded).

The problem is that for a single query sent to Salesforce by tSalesforceInput in Query mode, Talend generates more than one token to communicate with Salesforce. As a consequence, Salesforce locks out the user when the limit of 3,600 connections per hour is exceeded. The limit is exceeded even though the number of queries is far below 3,600, say about 2,000.

My question is: can I force Talend to generate and use one token per query sent to Salesforce by tSalesforceInput in Query mode?

Or maybe it is possible to use one token for all of the queries? 

 

I will be grateful for your help.

 

3 Replies
TRF
Champion II

Hi,

Since you haven't shared your job, I suspect you have a design problem.

Since you have to iterate over the tSalesforceInput component, you should connect to Salesforce outside of the loop using a tSalesforceConnection component, then reuse the opened connection in the tSalesforceInput.

 

tSalesforceConnection
|
+ (OnSubjobOk)
|
+ tLoop --> tSalesforceInput (using the opened connection)
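The effect of this layout can be sketched in plain Python (the `SalesforceClient` class below is a hypothetical stand-in for the Talend components, not a real Salesforce API): logging in once before the loop keeps the token count at one, no matter how many queries the loop issues, whereas connecting inside the loop pays one token per query.

```python
class SalesforceClient:
    """Hypothetical stand-in for a Salesforce connection (not a real API)."""

    def __init__(self):
        self.logins = 0      # counts token (login) requests
        self.session = None

    def login(self):
        self.logins += 1
        self.session = f"token-{self.logins}"
        return self.session

    def query(self, soql, session=None):
        # A fresh login per query is what inflates the token count.
        if session is None:
            session = self.login()
        return f"rows for: {soql} (via {session})"


# Anti-pattern: connection opened inside the loop -> one token per query.
bad = SalesforceClient()
for i in range(2000):
    bad.query(f"SELECT Id FROM SocialPersona WHERE Batch = {i}")
assert bad.logins == 2000

# Recommended pattern: connect once before the loop (the
# tSalesforceConnection equivalent) and reuse the session.
good = SalesforceClient()
session = good.login()
for i in range(2000):
    good.query(f"SELECT Id FROM SocialPersona WHERE Batch = {i}",
               session=session)
assert good.logins == 1
```

This mirrors the job design above: tSalesforceConnection plays the role of the single `login()` call, and each tSalesforceInput iteration reuses that session.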

 

Anonymous
Not applicable
Author

Thank you for the suggestion; I will try it this way and test it.

Do you think it will be possible to keep one opened connection to Salesforce for a couple of hours? This is what I am afraid of: that the connection will be dropped because of a timeout.

 

Normally my flow works fine and completes within 10 to 20 minutes. The problem occurs only with massive data transformations, which unfortunately happen from time to time. When I need to process e.g. 10M records, it takes a couple of hours.

 

 

TRF
Champion II

No problem, as long as the connection remains active.
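If the session does expire during a long run, a common mitigation is to re-authenticate only when the token is actually stale, so a multi-hour job pays one token per session lifetime rather than one per query. A minimal sketch of that idea (the `Session` class and TTL value are hypothetical, not part of Talend or the Salesforce API):

```python
import time


class Session:
    """Hypothetical session wrapper that re-authenticates only on expiry."""

    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock
        self.logins = 0
        self.expires_at = 0.0
        self.token = None

    def get_token(self):
        now = self.clock()
        if self.token is None or now >= self.expires_at:
            self.logins += 1                  # a token is only spent on expiry
            self.token = f"token-{self.logins}"
            self.expires_at = now + self.ttl
        return self.token


# Simulated clock: 6 hours of queries against an assumed 2-hour session TTL.
t = [0.0]
sess = Session(ttl_seconds=2 * 3600, clock=lambda: t[0])
for minute in range(0, 6 * 60, 10):           # one query every 10 minutes
    t[0] = minute * 60
    sess.get_token()
assert sess.logins == 3                       # one login per 2-hour window
```

With this pattern the token count grows with elapsed time, not with the number of queries, which keeps a multi-hour run well under the hourly connection limit.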