Hello all,
I am extracting metadata from 20k apps and writing it back to Azure Blob. However, the file size is becoming a bottleneck; the write throws an error that appears to be a limitation of the Azure Blob write operation. I want to split the output into a new file every 5k records. Can someone please suggest a possible approach, such as a loop or a batched loop?
Thanks, tres
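For anyone who wants to see the idea spelled out: below is a minimal Python sketch, assuming direct use of the azure-storage-blob SDK rather than whatever connector your platform provides. The container name, blob naming pattern, and connection string are placeholders. It splits the records into groups of 5,000 and writes each group to its own blob, so no single write ever exceeds the limit.

```python
# Minimal sketch: split ~20k metadata records into batches of 5k and write
# each batch to its own blob. Assumes the azure-storage-blob Python SDK;
# the container name and connection string are placeholders.
import json
from azure.storage.blob import BlobServiceClient

BATCH_SIZE = 5000

def upload_in_batches(records, connection_string, container="metadata-container"):
    service = BlobServiceClient.from_connection_string(connection_string)
    for batch_no, start in enumerate(range(0, len(records), BATCH_SIZE), start=1):
        batch = records[start:start + BATCH_SIZE]
        blob_name = f"app-metadata-part-{batch_no:03d}.json"
        blob_client = service.get_blob_client(container=container, blob=blob_name)
        # Each batch becomes a brand-new blob, so nothing has to be reopened
        # or appended to after it is written.
        blob_client.upload_blob(json.dumps(batch), overwrite=True)
```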
I have been working on this and learnt many things in the process. I was able to create a custom list to iterate over by making groups from the space/app list, the idea being that I have to break the files down by a logical counter. However, the problem I am still struggling with is writing the data inside the loop. The challenge remains with the 'save and close' of the file: there is no option to open an existing file and write to it, which means writing is only possible at the moment the file is created.
I don't know if I am missing something basic, and I keep thinking there must be a way. If anybody has implemented something similar, please share. Thank you.
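In case it helps with the 'save and close' limitation: standard block blobs cannot be reopened and appended to once committed, but Azure also offers append blobs, which are created once and can then receive additional blocks on every loop iteration. A minimal sketch with the azure-storage-blob Python SDK follows; the container and blob names are placeholders, and your platform's connector may or may not expose this blob type.

```python
# Minimal sketch: create an append blob once, then add a block to it on each
# loop iteration. Assumes the azure-storage-blob SDK; names are placeholders.
import json
from azure.storage.blob import BlobServiceClient

def write_incrementally(batches, connection_string, container="metadata-container"):
    service = BlobServiceClient.from_connection_string(connection_string)
    blob_client = service.get_blob_client(container=container, blob="app-metadata.jsonl")
    blob_client.create_append_blob()  # the "create" step happens exactly once
    for batch in batches:
        # Each call appends a new block to the existing blob, so the file
        # does not have to be recreated to keep writing to it.
        # Note: each appended block is limited to 4 MiB.
        lines = "\n".join(json.dumps(record) for record in batch) + "\n"
        blob_client.append_block(lines)
```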