I have a table with 400,000 rows, and each row holds its own XML document with multiple fields. I need to read this table and convert each row into multiple fields with values. All rows produce the same fields.
I tried to use a FROM_FIELD load inside a FOR loop over each row, but it is slow and the process would take around 32 hours. Does anybody know how to convert all the information at the same time?
I think your approach is already quite fast for what it does: each load needs around 0.3 seconds just to initialize, a fixed overhead that in your case is much larger than the time spent on the real data.
The only way I can imagine at the moment is to adjust the XML itself. This means removing the header/footer from each row's XML (with various string functions) and/or extending it with additional information like the record id, then building up the content row by row in a (preceding) load chain with inter-record functions like Previous() or Peek(). I'm not sure it will work with all records in a single run, but this approach should reduce the number of loads significantly. And of course, at the end it must yield a valid XML document again.
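Outside of Qlik script, the core idea — strip the per-row XML declarations, concatenate the fragments under one synthetic root tagged with a record id, and parse everything in a single pass instead of once per row — can be sketched in Python. All names and the sample data below are hypothetical; in Qlik script the string handling would be done with functions like Mid()/Index() and Previous()/Peek() instead:

```python
import xml.etree.ElementTree as ET

# Hypothetical per-row XML, as it might sit in the source table:
# every row carries a complete document with its own declaration.
rows = [
    '<?xml version="1.0"?><record><name>Alice</name><age>30</age></record>',
    '<?xml version="1.0"?><record><name>Bob</name><age>25</age></record>',
]

def strip_declaration(doc: str) -> str:
    """Remove the <?xml ...?> header so fragments can be concatenated."""
    end = doc.find("?>")
    return doc[end + 2:] if doc.startswith("<?xml") and end != -1 else doc

# Wrap every fragment in one synthetic root element and tag it with its
# record id, so a single parse replaces one parse per source row.
merged = "<rows>" + "".join(
    f'<row id="{i}">{strip_declaration(doc)}</row>'
    for i, doc in enumerate(rows)
) + "</rows>"

# One parse for all records.
tree = ET.fromstring(merged)
parsed = [
    {"id": row.get("id"),
     **{child.tag: child.text for child in row.find("record")}}
    for row in tree
]
```

The same shape should scale to 400,000 fragments, since the per-load (here: per-parse) initialization overhead is paid once rather than once per row.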