Hi, I am a newbie in Talend.
I am trying to migrate data from one table to another table in the same database. The tasks I need to do are:
1) Update - if ic_no (in pesara) already exists
2) Insert a new row - if ic_no does not exist
I have already set the dataout output to update and the dataInsert output to only insert new rows. I am not sure if my tMap settings are correct.
The total amount of data that needs to be migrated is 150k rows, and the current output database already has 240k rows. I am not sure if the amount of data affects the job. I would appreciate it if someone could help me with this.
Thank you in advance.
Can you share the following information:
1) Where are you running the job from, Studio or a server?
2) If Studio, how much memory does the Studio machine have, and how much have you allocated?
3) When you ran the job, what were the JVM parameter settings: Xmx & Xms?
You are getting this issue because, while joining in the tMap or extracting the data, the data does not fit into the memory of the machine you are running the job from. Increasing the memory should solve the problem: you can raise the Xmx parameter to a higher limit to test it out.
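For example, in Talend Studio you can set this in the Run tab (Advanced settings > Use specific JVM arguments) or in the Studio .ini file; the exact values depend on how much RAM your machine has, something like:
-Xms1024M
-Xmx4096M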
Hi !
OK, so first, I think you can simplify your job like this:
pesara ---row1(Main)--- tMap_1 ---dataout--- ost_ahli ---row3(Main)--- tLogRow_1
In your tMysqlOutput component there is an option called Action on data, where you can choose "Update or insert". You have to define a key; if the key already exists, the component updates the row, otherwise it inserts a new one.
Then, if you run into memory problems, you can set the maximum buffer size in the tMap options (Advanced settings). Try playing with it and see whether it helps. There are also other options like tMysqlOutputBulk (or tMysqlOutputBulkExec); I have never used them, but they are good for large data volumes.
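Just to illustrate the idea (this is not necessarily the exact SQL Talend generates, only a sketch of the upsert logic, with placeholder values and example columns): with ic_no as the key and a UNIQUE index on it, a plain MySQL upsert would look roughly like this:
insert into ost_ahli (ic_no, name, tel_no)
values ('placeholder_ic_no', 'placeholder_name', 'placeholder_tel_no')
on duplicate key update
  name = values(name),
  tel_no = values(tel_no);
-- on duplicate key update only fires when a UNIQUE index (here, one on ic_no) would be violated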
Good luck
SGV
Thank you for your reply. I tried that design before, but all the data was inserted as new rows even when the ic_no already exists.
This is how I set up the query. Action on data is set to 'Update or insert'.
1) I am running it from my server (from my localhost)
2)
3) When you ran the job, what is the JVM parameter setting : Xmx & Xms?
Is there any other way I can do this, like limiting the amount of data per execution?
Hmmm ...
So if you run this SQL on your DB, do you get any counts of 2 or more?
select count(ic_no)
from ost_ahli
group by ic_no;
(The component updates EVERY column, so be careful with null columns.)
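A small variant of that query also lists which ic_no values are duplicated, if that is easier to read:
select ic_no, count(*) as cnt
from ost_ahli
group by ic_no
having count(*) > 1;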
Yes, there are some rows with a count of 2 or more. I got an error like this..
I noticed that the id column is 0 for all data rows. The id column is set as the PK in ost_ahli.
If I uncheck that column as a key, then the data is not inserted or updated at all.
Hmmmm....
So when you uncheck the key you cannot have duplicate PKs, but when you check it, duplicate PKs get created???
Do you have a PK constraint on your table?
Can you explain exactly what you need to do?
Do you already have data in your output table? Does the input table have duplicate keys, and do you want to merge the data into the other table?
Sorry for the late reply.
Actually I have two tables:
1) ost_table:
id (PK, AutoInc)
name
ic_no
created_by
created_dt
updated_by
updated_dt
pesara
tel_no
2) pesara:
name
ic_no
updated_by
tel_no
pesara
What I need to do is migrate the data from 'pesara' into 'ost_ahli' by checking the ic_no. ic_no is not a PK.
If pesara.ic_no already exists in ost_ahli, then that row should be updated. If it does not exist, a new row should be inserted into ost_ahli.
Currently the output table already has 250K++ rows.
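In plain SQL, what I want is basically something like the statement below (just to show the logic; I know ic_no would need a UNIQUE index in ost_ahli for this to work, which it does not have right now):
-- would require something like: alter table ost_ahli add unique key uq_ic_no (ic_no);
insert into ost_ahli (name, ic_no, updated_by, tel_no, pesara)
select p.name, p.ic_no, p.updated_by, p.tel_no, p.pesara
from pesara p
on duplicate key update
  name = values(name),
  updated_by = values(updated_by),
  tel_no = values(tel_no),
  pesara = values(pesara);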
OK, I will try to help you, but I have never used Talend with a MySQL DB, and in Oracle there is no auto-increment.
So, what happens if you remove id from your ost_table?
Because in MySQL, when you have an auto-increment id, you do not need to care about the id when you add a row...
First, let's try to only insert, with a properly incremented PK.
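For that insert-only test, the idea is to leave id out completely so MySQL assigns it itself, roughly like this (a sketch with a shortened column list):
insert into ost_ahli (name, ic_no, tel_no)
select name, ic_no, tel_no
from pesara;
-- id is not listed, so AUTO_INCREMENT generates it for each new row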
Good luck,
SGV