Anonymous
Not applicable

Data dump from one database table to another is very slow on Azure SQL Server

Hi,

 

Why does it take around 11 minutes to dump 2.8 million rows from one database table to another on Azure SQL Server?

Does anyone have a solution to speed up performance on Azure SQL Server?

 

Thanks,

Dhara

 

12 Replies
ThWabi
Creator II

Hi Dhara,

 

The MySQL input component I work with has an advanced setting "Enable stream". I do not know whether the Azure SQL Server components have this setting, too?

Are you using a bulk load component? If you are using an output component with inserts, did you consider adjusting the 'Batch size' and 'Commit every' settings? Have you thought about 'Enable parallel execution'?

If you are sending the rows through a tMap, you might want to adjust the 'Max buffer size' advanced setting.
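For what it's worth, here is roughly what 'Batch size' and 'Commit every' correspond to at the JDBC level. This is only a minimal sketch with made-up table and column names, not the code Talend generates:

import java.sql.Connection;
import java.sql.PreparedStatement;

public class BatchInsertSketch {

    // Sketch of batched inserts with a periodic commit; 'target_table'
    // and its columns are placeholders for illustration only.
    static void copy(Connection con, Iterable<Object[]> rows) throws Exception {
        con.setAutoCommit(false); // commit manually, like 'Commit every'
        String sql = "INSERT INTO target_table (id, name) VALUES (?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            int n = 0;
            for (Object[] r : rows) {
                ps.setObject(1, r[0]);
                ps.setObject(2, r[1]);
                ps.addBatch(); // buffer rows client-side, like 'Batch size'
                if (++n % 10_000 == 0) {
                    ps.executeBatch(); // one round trip per 10,000 rows
                    con.commit();
                }
            }
            ps.executeBatch(); // flush the final partial batch
            con.commit();
        }
    }
}

Larger batches mean fewer round trips to Azure, which is usually where the time goes.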

 

Best regards,

 

Thomas

 

Anonymous
Not applicable
Author

Hi Thomas,

 

I am only using the tMSSqlInput and tMSSqlOutput components. I have tried the 'Batch size' and 'Commit every' settings, but performance is still low. I am only executing a single job.

 

Thanks,

Dhara

 

dipanjan93
Contributor

Try increasing the JVM size. Also remove any tLogRow components you may have used within a subjob.
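In Talend Studio the JVM size is set under the job's Run tab > Advanced settings > "Use specific JVM arguments"; the values below are only example sizes to adjust to your machine:

-Xms1024M
-Xmx4096M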

Anonymous
Not applicable
Author

Hi,

 

I believe you are using tDBInput and tDBOutput components to load the data. You can make two changes to speed it up.

 

a) Currently you must be using a single tDBInput component to extract the data. You can use the tParallelize component to run parallel read operations, where each tDBInput fetches data from a certain partition. Please make sure you also provide enough memory for this operation. This way you can increase the throughput at the reading stage (see the first sketch below).

b) At the writing stage, could you please use the tDBBulk components instead of the normal output component? This will speed up the output stage (see the second sketch below).
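To illustrate point a): independent of Talend, partitioned parallel reading means each worker fetches a disjoint key range. A rough Java sketch; the connection supplier, table name, and numeric key column are assumptions for illustration:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Supplier;

public class PartitionedReadSketch {

    // Each worker reads a disjoint id range; 'source_table' and its 'id'
    // key column are placeholders for illustration only.
    static void readInParallel(Supplier<Connection> connect,
                               long minId, long maxId, int workers) {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        long span = (maxId - minId + 1 + workers - 1) / workers; // rows per partition
        for (int w = 0; w < workers; w++) {
            long lo = minId + w * span;
            long hi = Math.min(maxId, lo + span - 1);
            pool.submit(() -> {
                try (Connection con = connect.get();
                     PreparedStatement ps = con.prepareStatement(
                         "SELECT id, name FROM source_table WHERE id BETWEEN ? AND ?")) {
                    ps.setLong(1, lo);
                    ps.setLong(2, hi);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            // hand each row to the writing stage here
                        }
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
                return null;
            });
        }
        pool.shutdown();
    }
}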
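And to illustrate point b): the Microsoft JDBC driver exposes a bulk copy API (SQLServerBulkCopy), which is the kind of path a bulk component takes instead of row-by-row inserts. A minimal sketch; the connections and table names are placeholders:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import com.microsoft.sqlserver.jdbc.SQLServerBulkCopy;
import com.microsoft.sqlserver.jdbc.SQLServerBulkCopyOptions;

public class BulkCopySketch {

    // Streams a source result set into the destination table in bulk.
    static void bulkCopy(Connection src, Connection dst) throws Exception {
        try (Statement st = src.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, name FROM source_table");
             SQLServerBulkCopy bulk = new SQLServerBulkCopy(dst)) {
            SQLServerBulkCopyOptions opts = new SQLServerBulkCopyOptions();
            opts.setBatchSize(10_000); // rows per bulk batch
            bulk.setBulkCopyOptions(opts);
            bulk.setDestinationTableName("dbo.target_table");
            bulk.writeToServer(rs); // bulk-inserts the whole result set
        }
    }
}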

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂

Anonymous
Not applicable
Author

Hi,

 

I am getting the following error:

[Screenshot: error stating that the file mssql_data.txt could not be opened]

 

Thanks,

Dhara

Anonymous
Not applicable
Author

Hi Dhara,

 

Could you please share screenshots of your job details so that Talend Community members get more clarity?

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂

Anonymous
Not applicable
Author

Hi,

I have set Additional JDBC Parameters = "encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;" because I am loading the data onto Azure SQL Server. The following is my Talend job:

[Screenshot: the Talend job layout]
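For reference, those additional parameters are simply appended to the JDBC connection URL, so the effective connection string looks roughly like this (the server and database names are placeholders):

String url = "jdbc:sqlserver://yourserver.database.windows.net:1433;"
        + "databaseName=yourdb;"
        + "encrypt=true;trustServerCertificate=false;"
        + "hostNameInCertificate=*.database.windows.net;loginTimeout=30;";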

 

Thanks,

Dhara

ThWabi
Creator II

Hi Dhara,

 

Regarding the error "file ... could not be opened": did this job run in Talend Studio on your local PC or on a remote JobServer? Does the file "mssql_data.txt" exist in the directory D:\<something> (on your PC or on the host that runs the Talend JobServer)?
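One quick way to check from within the job is a small tJava step before the bulk step; the path below is only a placeholder for your actual one:

// In a tJava component; replace the placeholder with your real bulk file path.
java.io.File f = new java.io.File("D:/path/to/mssql_data.txt");
System.out.println(f.getAbsolutePath() + " exists: " + f.exists()
        + ", readable: " + f.canRead());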

 

Best regards,

 

Thomas

Anonymous
Not applicable
Author

Yes, I ran this job in Talend Studio on my local PC, and the file is located on my local PC at that same path.