Anonymous
Not applicable

Insert large amount of data (97,000,000)

Hi,
I have a source table with nearly 97,000,000 rows which I want to insert into a destination table.
I am using Microsoft SQL Server. In my job I use a tMSSqlInput to read the source, wired directly to a tMSSqlOutput that writes to the destination table. The job takes ages, but the SSIS job I am replacing runs in around a minute.
Is there a more efficient way of doing this?
Thanks,
Dave.
4 Replies
Anonymous
Not applicable
Author

Hi,
If your Microsoft SQL Server database supports bulk loading, Talend also provides bulk components to deal with large amounts of data.
Best regards
Sabrina
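
For context, "bulk" here means handing SQL Server an entire stream or file of rows to load in one operation, rather than sending row-by-row INSERT statements. The sketch below illustrates that idea at the JDBC level with the Microsoft driver's SQLServerBulkCopy class; the connection string, credentials, and table names are placeholders, and this is only an illustration of bulk loading, not the code the Talend components generate.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import com.microsoft.sqlserver.jdbc.SQLServerBulkCopy;
import com.microsoft.sqlserver.jdbc.SQLServerBulkCopyOptions;

public class BulkCopyDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string and credentials; adjust for your environment.
        String url = "jdbc:sqlserver://localhost:1433;databaseName=Staging;encrypt=false";

        try (Connection src = DriverManager.getConnection(url, "user", "password");
             Connection dst = DriverManager.getConnection(url, "user", "password");
             Statement stmt = src.createStatement();
             // Read the source rows as a forward-only result set.
             ResultSet rows = stmt.executeQuery("SELECT * FROM dbo.SourceTable")) {

            SQLServerBulkCopyOptions options = new SQLServerBulkCopyOptions();
            options.setBatchSize(100000);   // commit in chunks rather than one huge transaction
            options.setTableLock(true);     // take a bulk-update table lock, as fast-load paths do
            options.setBulkCopyTimeout(0);  // no timeout for a long-running load

            try (SQLServerBulkCopy bulkCopy = new SQLServerBulkCopy(dst)) {
                bulkCopy.setBulkCopyOptions(options);
                bulkCopy.setDestinationTableName("dbo.DestinationTable");
                bulkCopy.writeToServer(rows);  // one bulk-copy stream instead of per-row INSERTs
            }
        }
    }
}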
Anonymous
Not applicable
Author

Hi Sabrina,
What components would I use from the palette?
Thanks for your help,
Dave.
Anonymous
Not applicable
Author

Hi,
You can look at the component reference for tMSSqlBulkExec and tMSSqlOutputBulk, which includes related scenarios.
Best regards
Sabrina
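
For readers following along: tMSSqlOutputBulk writes the incoming rows to a delimited file, and tMSSqlBulkExec then asks SQL Server to load that file in a single bulk operation (essentially a BULK INSERT). The JDBC sketch below imitates that two-step pattern; the connection string, file path, column names, and table names are invented for illustration, and the file path must be visible to the SQL Server instance itself, not only to the client.

import java.io.BufferedWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class TwoStepBulkLoad {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string and staging file path.
        String url = "jdbc:sqlserver://localhost:1433;databaseName=Staging;encrypt=false";
        Path bulkFile = Path.of("C:\\bulk\\destination_load.csv");

        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            // Step 1 (the tMSSqlOutputBulk role): dump the source rows to a delimited file.
            try (Statement stmt = con.createStatement();
                 ResultSet rows = stmt.executeQuery("SELECT id, name, amount FROM dbo.SourceTable");
                 BufferedWriter out = Files.newBufferedWriter(bulkFile)) {
                while (rows.next()) {
                    out.write(rows.getLong("id") + ";" + rows.getString("name") + ";" + rows.getBigDecimal("amount"));
                    out.newLine();
                }
            }

            // Step 2 (the tMSSqlBulkExec role): have SQL Server ingest the file in one bulk operation.
            try (Statement stmt = con.createStatement()) {
                stmt.execute("BULK INSERT dbo.DestinationTable"
                        + " FROM 'C:\\bulk\\destination_load.csv'"
                        + " WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\\n', TABLOCK)");
            }
        }
    }
}

In a real Talend job the two components handle both steps for you; the point is that the expensive part becomes one server-side bulk load instead of millions of individual inserts.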
Anonymous
Not applicable
Author

Thanks Sabrina, I had overlooked these. The IntelliSense says "Loads efficiently data from file" and I took it literally.
Thanks again.