
Anonymous
Not applicable

Grouping rows, creating a group row, and modifying values

I've got a bunch of data in a MySQL table that resembles this:

    | sku        | group_name | description       | size | type   |
    |------------|------------|-------------------|------|--------|
    | EX-1000-XS | NULL       | Long text         | XS   | single |
    | EX-1000-S  | NULL       | Long text         | S    | single |
    | EX-1000-M  | NULL       | Long text         | M    | single |
    | EX-1000-L  | NULL       | Long text         | L    | single |
    | EX-1001-M  | EX-1001    | Another long text | M    | single |
    | EX-1001-L  | EX-1001    | Another long text | L    | single |
    | EX-1001    | NULL       |                   |      | group  |

Now I'm trying to create group rows by grouping that data, and also to update the grouped rows with the proper group_name:

    | sku        | group_name | description       | size | type   |
    |------------|------------|-------------------|------|--------|
    | EX-1000-XS | EX-1000    | Long text         | XS   | single |
    | EX-1000-S  | EX-1000    | Long text         | S    | single |
    | EX-1000-M  | EX-1000    | Long text         | M    | single |
    | EX-1000-L  | EX-1000    | Long text         | L    | single |
    | EX-1000    | NULL       |                   |      | group  |

I can do the whole thing with an extremely complex SQL query, but its re-usability is close to none and the complexity will get out of hand quickly. How can I break this out into a series of Talend component steps?

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

Hi,

 

    If you want to get a list of unique groups, I would suggest creating one more flow in the tMap with the same steps to pick the group name alone, passing that data to a tAggregateRow component, and sending the output as the list of group rows.
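
A minimal sketch in plain Java of what that second flow plus tAggregateRow would produce (the class name and row layout below are illustrative only, not the actual job configuration):

    import java.util.LinkedHashSet;
    import java.util.List;
    import java.util.Set;

    // Sketch only: the second tMap flow emits the derived group name for each
    // "single" row, and grouping on group_name in tAggregateRow reduces that to
    // one row per distinct group, which then becomes the "group" row.
    public class UniqueGroupSketch {

        public static void main(String[] args) {
            // Derived group names coming out of the first flow, one per "single" row.
            List<String> derivedGroups = List.of(
                    "EX-1000", "EX-1000", "EX-1000", "EX-1000", "EX-1001", "EX-1001");

            // Grouping with no aggregation functions is effectively a distinct.
            Set<String> uniqueGroups = new LinkedHashSet<>(derivedGroups);

            // Each distinct group becomes a "group" row: sku = group name, type = "group".
            for (String group : uniqueGroups) {
                System.out.println(group + " | NULL |  |  | group");
            }
        }
    }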

 

Warm Regards,

 

Nikhil Thampi


Replies
Anonymous
Not applicable
Author

The initial data comes from a MySQL input, which right now runs that monstrous SQL query to do the whole transform; it then goes through a tJoin and ends at one of two MySQL outputs, one for updates and one for inserts, depending on whether the SKU already exists. I don't need any assistance with the output; it's the transform I'm not sure how to refactor.
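
(For context only, since the output side already works: the update-versus-insert routing described above behaves roughly like the plain-JDBC sketch below. The products table name and column list are assumptions for illustration, not the actual job settings.)

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Illustration only (not the actual MySQL output settings): send a row to
    // UPDATE when the sku already exists, otherwise to INSERT.
    public class UpsertSketch {

        static void writeRow(Connection conn, String sku, String groupName,
                             String description, String size, String type) throws SQLException {
            boolean exists;
            try (PreparedStatement check =
                     conn.prepareStatement("SELECT 1 FROM products WHERE sku = ?")) {
                check.setString(1, sku);
                try (ResultSet rs = check.executeQuery()) {
                    exists = rs.next();
                }
            }

            String sql = exists
                ? "UPDATE products SET group_name = ?, description = ?, size = ?, type = ? WHERE sku = ?"
                : "INSERT INTO products (group_name, description, size, type, sku) VALUES (?, ?, ?, ?, ?)";

            try (PreparedStatement stmt = conn.prepareStatement(sql)) {
                stmt.setString(1, groupName);
                stmt.setString(2, description);
                stmt.setString(3, size);
                stmt.setString(4, type);
                stmt.setString(5, sku);
                stmt.executeUpdate();
            }
        }
    }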

 

Anonymous
Not applicable
Author

Hi,

 

    I have used the file below as input.

 

sku        | group_name | description       | size | type   |
EX-1000-XS | NULL       | Long text         | XS   | single |
EX-1000-S  | NULL       | Long text         | S    | single |
EX-1000-M  | NULL       | Long text         | M    | single |
EX-1000-L  | NULL       | Long text         | L    | single |
EX-1001-M  | EX-1001    | Another long text | M    | single |
EX-1001-L  | EX-1001    | Another long text | L    | single |
EX-1001    | NULL       |                   |      | group  |

And below is the output.

[Screenshot: job output]

 

 

Below is the mapping of the tMap. [Screenshot: tMap mapping]

 

 

Below is the expression used in the tMap, where I derive the group name by trimming the sku.

 

    row1.type.trim().equals("single")
        ? row1.sku.substring(0, row1.sku.indexOf("-", row1.sku.indexOf("-") + 1))
        : null
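
The same logic can be checked outside the job with a small Java harness (the class and method names below are just for illustration):

    public class GroupNameDemo {
        // Same logic as the tMap expression above: for "single" rows, keep the sku
        // up to (but not including) the second "-"; otherwise return null.
        static String groupName(String sku, String type) {
            return type.trim().equals("single")
                    ? sku.substring(0, sku.indexOf("-", sku.indexOf("-") + 1))
                    : null;
        }

        public static void main(String[] args) {
            System.out.println(groupName("EX-1000-XS", "single")); // EX-1000
            System.out.println(groupName("EX-1001-L", "single"));  // EX-1001
            System.out.println(groupName("EX-1001", "group"));     // null
        }
    }

Note that this assumes every "single" sku contains two hyphens; if the second "-" is missing, indexOf returns -1 and the substring call will throw a StringIndexOutOfBoundsException.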

If the answer has helped you, could you please mark the topic as resolved? Kudos are also welcome 🙂

 

Warm Regards,

 

Nikhil Thampi

Anonymous
Not applicable
Author

Hi,

 

If the answer has helped you, could you please mark the topic as resolved?

 

Warm Regards,

 

Nikhil Thampi