Our target endpoint is a MySQL database, which has a maximum row size limit of 65535 bytes. We have a few wide tables with anywhere from 400 to 900 columns. Calculating the row size means adding up the byte sizes of all the columns in a table, which can be a tedious task when a table has many columns.
I saw that Qlik Replicate shows an approximate row size; I am attaching a screenshot showing that. Could I use this as an estimate for the maximum row size calculation?
The approximate row size shown in the screenshot refers to the row size in the source. There is no similar information for the target endpoint. In your case, we suggest performing a simple test where Replicate creates the table in the target. If it exceeds the target's maximum row size, you will get a message in the log.
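If you want to sanity-check a table yourself once Replicate has created it on the target, a rough estimate can be read from MySQL's information_schema. This is only a sketch, not an official Qlik query: the schema name is a placeholder, TEXT/BLOB columns are excluded because they count only a small pointer toward the 65535-byte limit, and the 8-byte fallback for non-character columns is just an approximation.

select table_name,
       sum(coalesce(character_octet_length, 8)) as approx_row_bytes
from information_schema.columns
where table_schema = 'my_schema'   -- placeholder target schema
  and data_type not in ('tinytext','text','mediumtext','longtext',
                        'tinyblob','blob','mediumblob','longblob')
group by table_name
having approx_row_bytes > 65535
order by approx_row_bytes desc;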
The approximate max row size should be good enough as an estimate.
The question is really a problem created by the target database, and the answer can only come from the source database. As such it has little to do with Replicate, which is 'only the messenger' - one that may well get shot by the target.
As you consider a solution to sum up the column sizes for a source database table, be sure to consider the impact of bytes vs. characters: a single character may take up to 4 bytes, although typically it will not. The target limitation is in bytes, as best I know.
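On a MySQL target you can see that multiplier directly: with utf8mb4, a varchar(100) is 100 characters but up to 400 bytes. A minimal sketch, assuming the table already exists on the target and using placeholder schema/table names:

select column_name,
       character_maximum_length as declared_chars,
       character_octet_length   as max_bytes
from information_schema.columns
where table_schema = 'my_schema'       -- placeholder
  and table_name   = 'my_wide_table'   -- placeholder
  and character_octet_length is not null
order by max_bytes desc;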
Each source DB has its own solution for this question. If the source is Oracle, it is simple: just query dba_tab_columns or user_tab_columns for data_length, with a 'where' on table_name as needed.
Wholesale Oracle solution:
select table_name "Table Name", sum(data_length) "Max bytes" from dba_tab_columns group by table_name;
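If the goal is specifically to spot the tables that would blow MySQL's 65535-byte limit, the same idea with a threshold (again assuming an Oracle source with access to dba_tab_columns; add an owner filter for the schemas you actually replicate):

select owner, table_name "Table Name", sum(data_length) "Max bytes"
from dba_tab_columns
-- where owner = 'MY_SCHEMA'   -- optional: limit to the replicated schema
group by owner, table_name
having sum(data_length) > 65535
order by sum(data_length) desc;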
Some databases may not have this data ready for query. Just generate a table definition and feed it to a script to do the addition?
Or just run the task and see whether it fails or not, giving feedback to your design team to review the critical minimal data on failure.
What is the source DB being used? Maybe a reader here has a function/query/script handy for you! 'stackexchange' probably does. Did you try there?
Hope this helps,
Hein
Good Day!
Upon checking, each source endpoint retrieves the tables' metadata, and the Replicate engine calculates the row size from that.
Please note that this is an approximation based on the metadata, not the actual size.
thanks
Lyka