Good morning community,
a customer asked why there is a limit of 10.000 rows for a DataSet in Talend Data Preparation (6.3.1). I have found the setting dataset.records.limit in application.properties, but I am wondering what the impact is of increasing that limit to 100.000 rows. How much memory should we expect 100.000 rows to require in DataPrep?
Thanks in advance!
Best regards,
MK
Hi Mirco,
The sample size is set to 10.000 rows by default so that the UI stays fairly responsive in all situations. The more you increase the value, the less responsive the UI will become. I can hardly come up with accurate guidelines on what a "good" value would be: it depends on many factors, such as:
Back to your question on the expected amount of memory required: it depends on the number of columns as well as the number of rows in the sample.
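Since the footprint scales with both rows and columns, here is a rough back-of-envelope sketch. The per-cell byte size and the overhead factor are assumptions for illustration, not figures from Talend Data Preparation itself (in-memory object and JSON representations typically inflate raw cell data several times):

```python
def estimate_sample_memory(rows, columns, avg_bytes_per_cell=50, overhead_factor=3):
    """Very rough estimate of the in-memory size of a sample.

    avg_bytes_per_cell and overhead_factor are assumptions, not
    Talend figures; adjust them to match your own data and runtime.
    """
    raw_bytes = rows * columns * avg_bytes_per_cell
    return raw_bytes * overhead_factor

# Example: 100.000 rows x 20 columns under these assumptions
estimate_mb = estimate_sample_memory(100_000, 20) / (1024 ** 2)
print(f"~{estimate_mb:.0f} MB")
```

With these (hypothetical) numbers, a 100.000-row, 20-column sample would land in the hundreds of megabytes, which is why testing with your actual data is the only reliable check.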
So the simplest way to go is to give it a try with 100.000 rows and see how the product behaves.
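For reference, trying it out means changing the property you already found in application.properties (the exact file location depends on your installation, and the service presumably needs a restart for the change to take effect):

```properties
# Raise the sample size from the 10.000-row default.
# Verify the property name against your installed version.
dataset.records.limit=100000
```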
Additional points to consider:
Regards,
Gwendal
Hi Gwendal,
thanks for the clear and fast response! We will give it a try with 100.000 rows.
Best regards,
MK