Anonymous
Not applicable

Very slow performance in bulk insert into SQL Server

Hi,
We've created some jobs that do bulk inserts (millions of rows) between two SQL Server databases. Performance is very slow: many of the bulk operations insert millions of rows and average about 3,000 rows/s.
We wonder why Talend behaves this way with bulk operations. Does the component run a commit for each row? Is there any way to configure the bulk process to commit every 1,000 or 10,000 rows?
In all cases we truncate or drop the table on the destination database first, so there is no index or PK causing the slow performance.
Thanks for your time.
4 Replies
Anonymous
Not applicable
Author

Hi,
Would you mind uploading screenshots of your job design to the forum so that we can understand your current situation more precisely?
Best regards
Sabrina
Anonymous
Not applicable
Author

Hi
I've attached a screenshot. The issue occurs in all our jobs and has always been there, but right now it has become a critical issue.
In all bulk processes (thousands or millions of rows), performance is very slow and many processes run for a long time. I think the solution may be to modify some configuration parameter on tMap or tMSSqlOutput.
The same (or similar) bulk processes developed in DTSX run quickly (seconds or minutes), and usually the source and destination databases are even on the same machine.
Cheers

Attachments: 0683p000009MBNh.png, 0683p000009MBNm.png, 0683p000009MBNr.png
gorotman
Creator II

Hi mtbabel,
I don't see any bulk component in your job (a suggestion: with the labels erased, it's hard to understand the job steps).
There is a tMSSqlOutput (as you said in your second post). This component has the two parameters you need: "Use batch size" and "Commit every" (see the tMSSqlOutput documentation).
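As a rough illustration of why those two parameters matter (a sketch with hypothetical numbers, not the code Talend actually generates): batching reduces JDBC round-trips, and committing every N rows reduces transactions, whereas executing and committing per row easily explains a rate around 3,000 rows/s.

```java
public class CommitSchedule {
    // Counts the work implied by a batch size and a commit interval:
    // one executeBatch() call per full batch of rows, one commit() per
    // commitEvery rows. Without batching (batchSize = commitEvery = 1),
    // every row costs its own execute + commit round-trip.
    static long[] roundTrips(long rows, int batchSize, int commitEvery) {
        long flushes = (rows + batchSize - 1) / batchSize;     // executeBatch() calls
        long commits = (rows + commitEvery - 1) / commitEvery; // commit() calls
        return new long[]{flushes, commits};
    }

    public static void main(String[] args) {
        // 1,000,000 rows committed per row would mean 1,000,000 commits.
        // The same rows with batch size 1,000 and commit every 10,000:
        long[] tuned = roundTrips(1_000_000, 1_000, 10_000);
        System.out.println(tuned[0] + " batches, " + tuned[1] + " commits");
        // -> 1000 batches, 100 commits
    }
}
```

In real JDBC terms, the driver sends one statement per `executeBatch()` and SQL Server writes one transaction per `commit()`, which is why the two intervals are tuned independently.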
For bulk operations you need to use specific components that create the bulk file and then load it into the target table.
Look at tMSSqlOutputBulkExec (it does both operations) or the pair tMSSqlOutputBulk and tMSSqlBulkExec.
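To make the difference concrete (a hypothetical sketch: the table and file names are invented, and the exact statement the component issues may differ): the bulk components stage the rows into a flat file and then ask SQL Server to load that file server-side in a single statement, instead of sending rows over JDBC batch by batch.

```java
public class BulkLoadSketch {
    // Builds roughly the kind of T-SQL statement the bulk-exec step runs.
    // All identifiers here (dbo.target, stage file path, separators) are
    // illustrative placeholders, not values from the original job.
    static String bulkInsertSql(String table, String file,
                                String fieldSep, String rowSep) {
        return "BULK INSERT " + table
             + " FROM '" + file + "'"
             + " WITH (FIELDTERMINATOR = '" + fieldSep + "',"
             + " ROWTERMINATOR = '" + rowSep + "')";
    }

    public static void main(String[] args) {
        System.out.println(bulkInsertSql("dbo.target", "C:\\tmp\\stage.csv", ";", "\\n"));
    }
}
```

The whole load then happens inside the database engine, which is also why DTSX packages using a bulk path finish in seconds or minutes on the same data.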
I hope this helps.
Bye
Anonymous
Not applicable
Author

Hi
Sorry for the misunderstanding. I didn't say that I used "bulk components"; I meant that I have a bulk process (but developed with tMSSqlOutput).
Regarding the erased labels: the only thing on them is my databases' names, which I didn't think would matter to you.
My tMSSqlOutput has the "Use batch size" parameter checked, with a value of 10,000. I don't see the "Commit every" parameter (we're using the old version 5.2.1).
I'm going to look into how to use the bulk components.
Thanks for your time and suggestions.
Cheers