Anonymous
Not applicable

Duplicate rows when using tLoop with 2 tUnite

Hello,
When I use only one tUnite after a tLoop I get no strange duplicate rows, but if I add a second tUnite, the duplicate rows appear from the second step onward. Perhaps it is a buffer problem?
I have attached screenshots of the 2 jobs; they are very simplified, the real job is more complex.
In the real job I read a database, map a lot of fields, build a PLMXML file and execute a script to import the data into Teamcenter (Siemens' PLM).
The metadata are simple and not very large, but the data files amount to several terabytes.
In my sample the output is not the same as in the real job, but it illustrates the problem.

output with two tUnite:
 connecting to socket on port 3857
connected
------------------------------------------------------------------------------------------------------
step:1
------------------------------------------------------------------------------------------------------
1one
2two
3three
------------------------------------------------------------------------------------------------------
step:2
------------------------------------------------------------------------------------------------------
1one
2two
3three
1one
2two
1one
2two
3three
------------------------------------------------------------------------------------------------------
step:3

output with one tUnite:
 connecting to socket on port 3674
connected
------------------------------------------------------------------------------------------------------
step:1
------------------------------------------------------------------------------------------------------
1one
2two
3three
------------------------------------------------------------------------------------------------------
step:2
------------------------------------------------------------------------------------------------------
1one
2two
3three
------------------------------------------------------------------------------------------------------
step:3
------------------------------------------------------------------------------------------------------
1one
2two
3three

Pfff, I named my job badly: the name "Job loopKO" is for the job with 2 tUnite (sorry).
And sorry for my bad English.
4 Replies
Anonymous
Not applicable
Author

Hi,
The tUnite component merges data from various sources based on a common schema, without any filtering.
If you have to use two tUnite components, it is suggested that you use tUniqRow to keep only the unique rows.
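In plain Java terms (a Talend job is compiled to Java), the effect of a key-based tUniqRow on a merged flow is roughly the minimal sketch below; the class and column names are invented for the example and are not taken from the generated job code.

import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class UniqRowSketch {

    // A row as it could look after tUnite: an id plus a label.
    static class Row {
        final int id;
        final String label;
        Row(int id, String label) { this.id = id; this.label = label; }
    }

    // Keep only the first occurrence of each id, preserving input order;
    // this is essentially what tUniqRow does when the id column is the key.
    static List<Row> dedupeById(List<Row> merged) {
        Set<Integer> seen = new LinkedHashSet<>();
        List<Row> unique = new ArrayList<>();
        for (Row r : merged) {
            if (seen.add(r.id)) {        // add() returns false for an id already seen
                unique.add(r);
            }
        }
        return unique;
    }

    public static void main(String[] args) {
        List<Row> merged = new ArrayList<>();
        merged.add(new Row(1, "one"));
        merged.add(new Row(2, "two"));
        merged.add(new Row(1, "one"));   // duplicate introduced by the second merge
        for (Row r : dedupeById(merged)) {
            System.out.println(r.id + r.label);   // prints 1one then 2two
        }
    }
}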
Best regards
Sabrina
Anonymous
Not applicable
Author

I can't use tUniqRow because in the real job the rows can't be filtered that way.
Normally I generate a unique string to write my PLMXML.
As you can see in the screenshot, in the tTemplate I generate a string:
<header ref="#1 #2 #A #b ..." />

and when I use 2 tUnite, after the first step I get:
<header ref="#1 #2 #A #b ..." />
<product .../>
<product .../>
<product .../>
<header ref="#1 #2 #A #b #A0 #A1 ... #1 #2 #A #b" />

I can't have 2 headers or other extra nodes in my PLMXML, otherwise I cannot launch my import.
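To illustrate the one-header-per-step idea in plain Java, outside Talend: collect only the refs produced by the current step and emit exactly one header from them. The refsPerStep values below are invented for the example; in the real job they would come from the mapped database rows of that step.

import java.util.ArrayList;
import java.util.List;

public class HeaderPerStepSketch {

    public static void main(String[] args) {
        // Invented refs for two loop steps.
        String[][] refsPerStep = {
            { "#1", "#2", "#A", "#b" },
            { "#A0", "#A1" }
        };

        for (int step = 0; step < refsPerStep.length; step++) {
            // Re-created inside the loop so refs from earlier steps
            // cannot leak into the next header.
            List<String> refs = new ArrayList<>();
            for (String ref : refsPerStep[step]) {
                refs.add(ref);
            }
            System.out.println("step:" + (step + 1) + " -> "
                    + "<header ref=\"" + String.join(" ", refs) + "\" />");
        }
    }
}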
Fortunately, I can use tMap in this job.
Normally each step should give me 100 rows, but with 2 tUnite I get:
step 1: 100 rows
step 2: 200 rows
step 3: 300 rows
...
That is abnormal.
With a tMap and one tUnite it's OK, but in another job I would normally need several tUnite components!
I want to know whether this is normal behaviour or a Talend bug.
If it is normal, I must find another way.
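As a rough picture of that growth (100, 200, 300 rows), here is a minimal plain-Java sketch of what happens when rows merged in earlier iterations are carried into the next one instead of being discarded. It is only an analogy for the observed behaviour, not the code Talend actually generates.

import java.util.ArrayList;
import java.util.List;

public class MergeInsideIterateSketch {

    public static void main(String[] args) {
        int rowsPerStep = 100;
        // Stand-in for merged-flow state that is never reset between tLoop steps.
        List<String> carriedOver = new ArrayList<>();

        for (int step = 1; step <= 3; step++) {
            List<String> thisStep = new ArrayList<>();
            for (int i = 1; i <= rowsPerStep; i++) {
                thisStep.add("Step" + step + "_row" + i);
            }

            // Merging the carried-over rows with the new ones reproduces the
            // growth: 100, then 200, then 300 rows instead of 100 each time.
            List<String> output = new ArrayList<>(carriedOver);
            output.addAll(thisStep);
            System.out.println("step " + step + ": " + output.size() + " rows");

            carriedOver.addAll(thisStep);   // nothing ever clears the buffer
        }
    }
}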

Anonymous
Not applicable
Author

Hi,
It is normal, not a bug, because tUnite just merges data without filtering. I think you should find a more suitable component for your job design, such as tMap: you can set a filter in it according to your requirement.
For more information, please have a look at the online user manual for the tMap operation and the tMap Job example.
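For what such a filter amounts to, here is a minimal plain-Java sketch that keeps only the rows of the current iteration; the stepId column is invented for the example. Inside Talend, the equivalent condition would go into the tMap output filter, typically comparing such a column against the tLoop counter read from globalMap (for example ((Integer)globalMap.get("tLoop_1_CURRENT_VALUE")) for a For-type tLoop).

import java.util.ArrayList;
import java.util.List;

public class FilterByStepSketch {

    static class Row {
        final int stepId;        // invented column telling which step produced the row
        final String payload;
        Row(int stepId, String payload) { this.stepId = stepId; this.payload = payload; }
    }

    // Equivalent of a tMap output filter: keep only rows of the current iteration.
    static List<Row> keepCurrentStep(List<Row> merged, int currentStep) {
        List<Row> kept = new ArrayList<>();
        for (Row r : merged) {
            if (r.stepId == currentStep) {
                kept.add(r);
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        List<Row> merged = new ArrayList<>();
        merged.add(new Row(1, "Step1_row1"));   // leftover from step 1
        merged.add(new Row(2, "Step2_row1"));
        merged.add(new Row(2, "Step2_row2"));

        for (Row r : keepCurrentStep(merged, 2)) {
            System.out.println(r.payload);       // prints only the Step2 rows
        }
    }
}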
Best regards
Sabrina
Anonymous
Not applicable
Author

It is normal not a bug. .....

Hmm, for me it is not normal.
I have remade my sample job, and now the log shows the step number.
In step 1 it is OK.
But afterwards I get duplicate rows from the tUnite2 and tUnite1 results of the previous steps.
After 1000 steps I could hit a Java heap space error.
I mean that having more than one MERGE flow is not compatible with an ITERATE flow.
------------------------------------------
Step=1
------------------------------------------
Step1_row1 <-normal
Step1_row2 <-normal
Step1_row3 <-normal
------------------------------------------
Step=2
------------------------------------------
Step1_row1 <-not normal
Step1_row2 <-not normal
Step1_row3 <-not normal
Step1_row1 <-not normal
Step1_row2 <-not normal
Step2_row1 <-normal
Step2_row2 <-normal
Step2_row3 <-normal
------------------------------------------
Step=3
------------------------------------------
Step1_row1 <-not normal
Step1_row2 <-not normal
Step1_row3 <-not normal
Step1_row1 <-not normal
Step1_row2 <-not normal
Step2_row1 <-not normal
Step2_row2 <-not normal
Step2_row3 <-not normal
Step1_row1 <-not normal
Step1_row2 <-not normal
Step2_row1 <-not normal
Step2_row2 <-not normal
Step3_row1 <-normal
Step3_row2 <-normal
Step3_row3 <-normal
------------------------------------------
Step=4
------------------------------------------
