bhagyarekha
Creator II

Load CSV file data into MySQL output table (dynamic table name)

Hi Talend Team,

I have 61 CSV files, each with around 80,000 records (the exact count changes daily), and all files share the same schema. Per my company's requirements, I want to load these files into 61 separate tables rather than one. So I set a dynamic table name in the tMysqlOutput component, taken from the tFileList as:
((String)globalMap.get("tFileList_1_CURRENT_FILE"))
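Note that `tFileList_1_CURRENT_FILE` returns the file name including its extension, so the expression above would produce table names like `customers.csv`. A minimal sketch of stripping the extension first (the `globalMap` normally provided by the Talend runtime is simulated here with a plain `Map`, and the file name is an assumed example):

```java
import java.util.HashMap;
import java.util.Map;

public class TableNameDemo {
    // Derive a table name from the tFileList current file name by
    // stripping the extension. In a real job this expression would go
    // in the tMysqlOutput "Table" field; globalMap is simulated here.
    static String tableName(Map<String, Object> globalMap) {
        String fileName = (String) globalMap.get("tFileList_1_CURRENT_FILE");
        int dot = fileName.lastIndexOf('.');
        return (dot > 0) ? fileName.substring(0, dot) : fileName;
    }

    public static void main(String[] args) {
        Map<String, Object> globalMap = new HashMap<>();
        globalMap.put("tFileList_1_CURRENT_FILE", "customers.csv");
        System.out.println(tableName(globalMap)); // prints "customers"
    }
}
```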
The problem is that when I run without batch mode and keep the line limit at 10,000 on the Run tab, all records are inserted and committed.

But when I increase "Number of rows to insert" to 100,000, none of the records are inserted into tables whose files have fewer records than that.

I also tried auto-commit mode, but got the same result.

I would appreciate a quick response.

Regards,
Rekha

7 Replies
Anonymous
Not applicable

Hello, can you please share your job design?
bhagyarekha
Creator II
Author

Hi Sankalp,
Please find the screenshots below.

[screenshots of the job design attached]
avinashbasetty
Contributor III

Hi Bhagya,
You will not get the reject option when the Extend Insert option in the Advanced settings is checked. Try checking it and setting the commit size.
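For context, a hedged sketch of what multi-row insert grouping looks like (an assumption about Extend Insert's behavior, based on MySQL's multi-row INSERT syntax: several rows are combined into a single statement instead of one statement per row; table and row values below are made-up examples, and a real job would use parameterized statements rather than string concatenation):

```java
import java.util.List;
import java.util.StringJoiner;

public class ExtendInsertDemo {
    // Builds one multi-row INSERT statement from a list of rows,
    // roughly what grouping rows per statement does: fewer round
    // trips to the server than one INSERT per row.
    static String multiRowInsert(String table, List<String[]> rows) {
        StringJoiner values = new StringJoiner(", ");
        for (String[] row : rows) {
            values.add("('" + String.join("', '", row) + "')");
        }
        return "INSERT INTO " + table + " VALUES " + values;
    }

    public static void main(String[] args) {
        List<String[]> rows = List.of(
            new String[]{"1", "alice"},
            new String[]{"2", "bob"});
        System.out.println(multiRowInsert("customers", rows));
        // INSERT INTO customers VALUES ('1', 'alice'), ('2', 'bob')
    }
}
```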
[screenshot of the Advanced settings attached]

Regards,
Avinash.B
8885453419
Anonymous
Not applicable

Can you please also share a screenshot of the tMYSQL_Output5 component?
avinashbasetty
Contributor III

Hi Bhagya,
I was able to run it. I also checked the Enable Parallel Execution option.

[screenshot of the working job attached]

Regards,
Avinash.B
8885453419
Anonymous
Not applicable

This should not be the case. I have done this in the past with a batch size larger than the number of records coming from the source. Something else is missing here.
talendtester
Creator III
Creator III

I think the amount of data being committed at one time is too large at 100K.
Try setting the insert size to 20K and increasing it by 10K until you find where it fails.
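One possible explanation for the original symptom, sketched below as an assumption rather than a confirmed diagnosis of Talend's generated code: if rows are buffered until the batch size is reached and the final partial batch is never flushed, then a batch size of 100,000 against an 80,000-row file writes nothing at all. The simulation uses made-up row data and a hypothetical `insertAll` helper:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchFlushDemo {
    // Simulates batched inserts: rows are buffered and written out only
    // when the buffer reaches batchSize. Without a final flush, rows
    // left in a partially filled buffer are silently dropped.
    static int insertAll(List<String> rows, int batchSize, boolean finalFlush) {
        List<String> buffer = new ArrayList<>();
        int written = 0;
        for (String row : rows) {
            buffer.add(row);
            if (buffer.size() == batchSize) {
                written += buffer.size(); // executeBatch()-style flush
                buffer.clear();
            }
        }
        if (finalFlush) { // the step that must not be skipped
            written += buffer.size();
            buffer.clear();
        }
        return written;
    }

    public static void main(String[] args) {
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 80_000; i++) rows.add("row" + i);
        System.out.println(insertAll(rows, 10_000, true));   // prints 80000
        System.out.println(insertAll(rows, 100_000, true));  // prints 80000
        System.out.println(insertAll(rows, 100_000, false)); // prints 0
    }
}
```

The last case mirrors the reported behavior: the 100,000-row batch never fills, so no rows reach the table, while a 10,000-row batch fills eight times and commits everything.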