vbadri
Contributor III

Issue with tAzureSynapseOutput and tDBOutput

Hello there,

I'm trying to load some data into Synapse tables using the Dynamic column type.

I tried both the 'tAzureSynapseOutput' and 'tDBOutput' components, and both perform very slow INSERT-only operations, at around 4.2 rows/sec.

I'm also seeing an error saying that 'tAzureSynapseOutput' doesn't support 'Dynamic Type', but the documentation (tAzureSynapseOutput Standard properties • Azure SQL Data Warehouse • Reader • Welcome to Talend Help...) says that it does:

"The dynamic schema feature can be used in the following modes: Insert, Update, Insert or update, Update or insert, and Delete."

 

 

Does that mean the 'Dynamic' data type in the schema?

 

Thanks & Kind Regards

17 Replies
vbadri
Contributor III
Author

I forgot to highlight another issue here.

 

I tried the very same components with an explicit schema (instead of Dynamic), and the speeds are about the same.

 

The operation is INSERT only. I would assume a plain INSERT (like TRUNCATE and LOAD) should be much faster than 'Update', 'Insert or update', 'Update or insert', or 'Delete'.

 

Can you please take a look at this issue?

 

Thanks.

COW_WW
Contributor

Hello,

I've got a similar issue.

I'm using tAzureSynapseOutput for an Insert with a Built-In schema, and it is extremely slow: loading 8,760 records takes about 10 minutes, which doesn't make sense.

I replaced the tAzureSynapseOutput with tFileOutputDelimited just to test the job itself, and the load completed in about 2 seconds.
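For scale, a back-of-the-envelope sketch of the throughput implied by the numbers above (all figures come from this post; the class and method names are purely illustrative, not Talend code):

```java
// Rough throughput comparison from the figures in this thread:
// 8,760 records in ~10 minutes via tAzureSynapseOutput vs. ~2 seconds to a flat file.
public class ThroughputSketch {
    static double rowsPerSecond(long rows, double seconds) {
        return rows / seconds;
    }

    public static void main(String[] args) {
        // ~14.6 rows/sec through the database output component
        System.out.printf("DB output:   %.1f rows/sec%n", rowsPerSecond(8_760, 600.0));
        // ~4,380 rows/sec to a delimited file -- roughly a 300x gap
        System.out.printf("File output: %.1f rows/sec%n", rowsPerSecond(8_760, 2.0));
    }
}
```

A gap of this size usually points at per-row overhead in the output component rather than at the job's read/transform side.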

Is there any solution for this?

Thanks

Anonymous
Not applicable

Hello,

Could you please clarify which Talend version/edition you are using?

When you want to load a large volume of data, we suggest you use the tAzureSynapseBulkExec component. It supports two ways of loading large data sets into Azure Synapse.

Best regards

Sabrina
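One reason per-row inserts into Synapse can crawl at a few rows per second is that each row may cost a full network round trip, which bulk loading avoids. A minimal illustrative sketch (not generated Talend code; the class name and batch size are assumptions for illustration only) of how batching amortizes those round trips:

```java
// Illustrative only: counting network round trips for row-by-row vs. batched inserts.
// The 8,760-row figure comes from this thread; the 1,000-row batch size is an assumption.
public class RoundTripSketch {
    // One round trip per row: 8,760 rows -> 8,760 trips.
    static long rowByRowTrips(long rows) {
        return rows;
    }

    // Batched: one trip per flushed batch of `batchSize` rows (ceiling division).
    static long batchedTrips(long rows, long batchSize) {
        return (rows + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        System.out.println(rowByRowTrips(8_760));        // 8760 trips
        System.out.println(batchedTrips(8_760, 1_000));  // 9 trips
    }
}
```

Bulk components go further still by staging the data and loading it server-side in one operation, rather than streaming individual INSERT statements.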

 

COW_WW
Contributor

I'm using version 8.0.1 (Build: R2022-03).

The load data size is very small - 8,760 records (about 1.5MB).

Anonymous
Not applicable

Hello,

Could you please check the "Die on error" option in the tAzureSynapseOutput component to see if any more errors are logged in the console?

Best regards

Sabrina

COW_WW
Contributor

The "Die on error" option was on from the beginning.

Anonymous
Not applicable

Hello,

Would you mind posting screenshots of your job design here? That would help us address your issue. Please mask any sensitive data.

Best regards

Sabrina

COW_WW
Contributor

[Screenshots of the job design attached.]

I also tried to load the data directly from a Parquet file located on ADLS Gen2, and the load speed was still extremely slow.

[Screenshot of the console output attached.]