Anonymous
Not applicable

How to put a conditional check on a data flow/row connector in a TOS job?

Hi Team,

In a job where data flows from tFilterRow to tFlowToIterate, how can I put a condition on the data flow/row connector itself, so that data is passed to tFlowToIterate only if the row count is greater than 0, and otherwise the process stops?

Thanks for your help and time!

Rera

4 Replies
Anonymous
Not applicable
Author

The components linked from tFlowToIterate will not work if there are no rows. What's your data source? Usually there is a global variable that counts the total number of lines read from the data source; you can use this global variable to check whether to continue processing the next subjob.
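As an illustration of that pattern: Talend components typically publish an NB_LINE counter into globalMap after their subjob finishes, and a Run If trigger can test it before firing the next subjob. A minimal sketch (the component name tFilterRow_1 is an assumption; adjust it to the component's actual name in your job, and in the Run If condition you would write only the boolean expression):

```java
import java.util.HashMap;
import java.util.Map;

public class RunIfConditionDemo {

    // Simulates the Run If condition you would type on the trigger link, e.g.
    //   ((Integer) globalMap.get("tFilterRow_1_NB_LINE")) > 0
    // "tFilterRow_1" is an assumed component name, not a built-in value.
    static boolean shouldContinue(Map<String, Object> globalMap) {
        Integer nbLine = (Integer) globalMap.get("tFilterRow_1_NB_LINE");
        return nbLine != null && nbLine > 0;
    }

    public static void main(String[] args) {
        Map<String, Object> globalMap = new HashMap<>();

        globalMap.put("tFilterRow_1_NB_LINE", 5);
        System.out.println(shouldContinue(globalMap)); // rows present: subjob runs

        globalMap.put("tFilterRow_1_NB_LINE", 0);
        System.out.println(shouldContinue(globalMap)); // no rows: subjob is skipped
    }
}
```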
Anonymous
Not applicable
Author

Hi Shong,

Thanks so much for your response. I will test whether tFlowToIterate with no input rows works for my scenario.

But currently my source is a continuous stream of JSON messages read by a tKafkaInput component with its timeout set to "-1". In this scenario, I plan to replicate the tKafkaInput component's output to multiple branches, approximately 20+, in a single job. These 20+ branches represent child jobs for 20 different DB tables. A single tKafkaInput component will read data for any of the tables, each with a different JSON schema.

Say I receive message JSON1 for Table 1, JSON2 for Table 2, JSON3 for Table 1, and so on; messages can arrive for any of the 20+ tables. When my job receives the JSON1 message, only BRANCH 1 (for the Table 1 child job) should run, calling child job 1 to extract the JSON1 message and populate Table 1. Likewise, if I receive the JSON2 message, which is for Table 2, I want all the branches to remain idle EXCEPT BRANCH 2, which should be active and call child job 2 to load Table 2.

Please suggest/guide me on which components will help me to:
A. allow only the relevant branch to be active and call the child job for the table whose JSON message has arrived

B. pass the JSON message (data stream) to the child job, then extract and process it to load the data into the DB tables
Thanks so much for your great support and time.
Rera
Anonymous
Not applicable
Author

Hi 
The main problem now is how to check which table the message is for. Is there an identifier in the message to distinguish the message type? Ideally, read and parse the message, then use a Run If link to fire the corresponding child job based on the message type.
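To make the Run If routing concrete: after parsing the message (for example in a tJavaRow), you could store the table identifier in globalMap, and each branch's Run If condition then tests for its own table. A minimal sketch, assuming a globalMap key "tableName" and hypothetical child-job names:

```java
import java.util.HashMap;
import java.util.Map;

public class BranchRoutingDemo {

    // In the job, each Run If link would carry one of these equality tests, e.g.
    //   "TABLE1".equals(globalMap.get("tableName"))
    // so only the matching branch (and its tRunJob) fires.
    // The key "tableName" and the child-job names are assumptions.
    static String pickChildJob(Map<String, Object> globalMap) {
        Object name = globalMap.get("tableName");
        if ("TABLE1".equals(name)) return "child_job_table1";
        if ("TABLE2".equals(name)) return "child_job_table2";
        return "none"; // no Run If condition matches, all branches stay idle
    }

    public static void main(String[] args) {
        Map<String, Object> globalMap = new HashMap<>();
        globalMap.put("tableName", "TABLE2");
        System.out.println(pickChildJob(globalMap)); // only branch 2 would run
    }
}
```

With 20+ tables, each branch carries one such condition, so exactly one tRunJob is triggered per message.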

Anonymous
Not applicable
Author

Thanks, Shong!

Yes, there is an identifier (metadata for the table name) in the JSON message that we receive from Kafka.
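Given such an identifier, the parsing step can be sketched as follows (the field name "tablename" is a hypothetical example; a proper JSON component such as tExtractJSONFields would be the more robust choice in the job itself):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TableNameExtractor {

    // Assumes the Kafka message carries the identifier in a field like
    //   "tablename": "TABLE1"
    // The field name is an assumption; adjust it to your real schema.
    private static final Pattern TABLE_FIELD =
        Pattern.compile("\"tablename\"\\s*:\\s*\"([^\"]+)\"");

    static String extractTableName(String json) {
        Matcher m = TABLE_FIELD.matcher(json);
        return m.find() ? m.group(1) : null; // null when no identifier is present
    }

    public static void main(String[] args) {
        String msg = "{\"tablename\":\"TABLE1\",\"payload\":{\"id\":1}}";
        System.out.println(extractTableName(msg)); // TABLE1
    }
}
```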