urend
Contributor

Iterating through a big CSV file

Hi there,

I have a CSV file with about 2,500 records. As I iterate through each one, I want to call about 5-10 API endpoints per record. What is the most efficient and quickest way to do this? I currently have it set up with no multiprocessing, because I don't think I can parallelise it naively: the API endpoints that check whether a record already exists wouldn't be up to date if multiple records were being worked on at the same time. Any ideas on the best way to solve this? Cheers
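For reference, the sequential setup described above might look something like this (a hypothetical Python sketch using the requests library; the file name, endpoint URLs and "id" column are placeholders, not details from the post):

```python
# Hypothetical sketch of the sequential setup described above (Python with the
# requests library is assumed; file name, endpoints and "id" are placeholders).
import csv

import requests

BASE_URL = "https://api.example.com"  # placeholder API base URL

def process_record(row):
    # The calls for a single record run strictly in order, so the existence
    # check below always reflects the earlier calls for that same record.
    exists = requests.get(f"{BASE_URL}/records/{row['id']}").status_code == 200
    if not exists:
        requests.post(f"{BASE_URL}/records", json=row)
    # ...the remaining endpoint calls for this record would follow here...

with open("records.csv", newline="") as f:
    for row in csv.DictReader(f):
        process_record(row)  # one record at a time, no parallelism
```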

Labels (1)
  • Other

1 Reply
Shicong_Hong
Support

Hello

Use the tFlowToIterate component to iterate over each record, and enable parallel execution, as shown below.

[Screenshot: tFlowToIterate component configured with parallel execution enabled]
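If the job is driven from code rather than from Talend, the same idea can be approximated with a bounded thread pool: records are processed in parallel, while the handful of API calls for any single record stay sequential inside one function, so that record's existence check remains valid. This is only a rough Python sketch using the same placeholder names as above, not Talend-generated code:

```python
# Rough Python analogue of parallel iteration over the records (assumed names
# and endpoints as in the earlier sketch; the worker count is just a guess).
import csv
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests

BASE_URL = "https://api.example.com"  # placeholder API base URL

def process_record(row):
    # Calls for one record stay in order inside this function, so parallelism
    # across records does not invalidate this record's existence check.
    exists = requests.get(f"{BASE_URL}/records/{row['id']}").status_code == 200
    if not exists:
        requests.post(f"{BASE_URL}/records", json=row)

with open("records.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# A bounded pool limits concurrent load on the API while records run in parallel.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(process_record, row) for row in rows]
    for future in as_completed(futures):
        future.result()  # re-raise any per-record API error
```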

Regards

Shicong