Source: Oracle DB
Target: Databricks Lakehouse (Delta)
I have enabled full load and change processing for 38 tables. The full load has completed for 37 of them.
For exactly one table, the full load keeps restarting. It has 102,644,804 rows. After approximately 4-5 hours of running, it restarts.
I found this message in the logs: Full load Max file size: 100 MB, 102400 KB
Hmm, 100MB is the default under Advanced Parameters.
Did multiple 100MB CSV files get created? After how many rows did you get the message? And would that correspond with just 100MB or much more?
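For a rough feel of scale, here is a minimal back-of-the-envelope sketch in Python; the average row size is a pure assumption, so measure one of the actual staged CSV files (bytes / rows) to get a real number:

```python
# Back-of-the-envelope: how many 100 MB CSV files should a full load
# of ~102.6M rows produce? AVG_ROW_BYTES is an assumption -- measure a
# real staged CSV file and plug the measured value in instead.
ROWS = 102_644_804
MAX_FILE_MB = 100
AVG_ROW_BYTES = 200  # assumed average CSV row size in bytes

total_mb = ROWS * AVG_ROW_BYTES / (1024 * 1024)
print(f"~{total_mb:,.0f} MB total -> roughly {total_mb / MAX_FILE_MB:,.0f} files of {MAX_FILE_MB} MB")
```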
Was that log-file message a warning ( ... ]W: ) or an error ( ... ]E: )?
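If the task log is too big to eyeball, a quick filter for those two markers could help; the log file name below is hypothetical, so point it at your actual task log:

```python
# Print only the warning (]W:) and error (]E:) lines from a Replicate
# task log. The path is an assumption -- use your real log file.
from pathlib import Path

LOG = Path("reptask_oracle_to_databricks.log")  # assumed file name

with LOG.open(errors="replace") as fh:
    for line in fh:
        if "]W:" in line or "]E:" in line:
            print(line.rstrip())
```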
Once it hits the max file size it is supposed to roll over to the next file and fill that to the max, and so on. According to the doc: "Maximum file size: Specify the maximum file size of each target file. When the data reaches the maximum size, the file will be closed and written to the specified target folder."
In the past - years ago - there was an issue where exactly hitting the max caused trouble due to a bug. Maybe something like that is happening? I'd isolate the troublesome table in its own task and, in the interest of time, first just try with a max of 5 MB or so to verify that multiple small CSV files are indeed created as they should be (see the sketch below).
Next I'd try 99 or 101 MB for the max in case it was just a bad-luck boundary condition. Mind you, that would be a bug and a support case would have to be opened.
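To check the rollover during those tests, something like this could list the staged CSV files and flag any that exceed the configured max; the staging folder path and the *.csv pattern are assumptions, so adjust them to your task's target folder:

```python
# List the CSV files Replicate staged for the table and flag any file
# larger than the configured maximum. Folder path and file pattern are
# assumptions -- adjust them to match your task's target folder.
from pathlib import Path

STAGING = Path("/mnt/replicate-staging/MY_BIG_TABLE")  # assumed path
MAX_MB = 5  # the temporary test value suggested above

for f in sorted(STAGING.glob("*.csv")):
    size_mb = f.stat().st_size / (1024 * 1024)
    note = "  <-- exceeds max" if size_mb > MAX_MB else ""
    print(f"{f.name}: {size_mb:.1f} MB{note}")
```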
Now, 100 MB doesn't sound like a whole lot and would be quickly filled. Bump it to 1000 MB? (The max is 2000.)
Hein.
Hello Team,
Thanks for reaching out to the Qlik Community!
Do you see any errors in the logs apart from "Full load Max file size: 100 MB, 102400 KB"?
We suggest you open a support case so we can analyze the logs and provide better support for troubleshooting.
Regards,
Shivananda
@chandraprakash_j_volvo - can you indicate what the root cause was, and how it was addressed? Thanks. Hein.