seperiya
Contributor

Multiple tasks writing to the same Hive schema

I have a scenario where I read Oracle logs and write them to a Log Stream staging task.

I need two kinds of tasks to capture changes and write them to the same Hadoop/Hive schema. The reason for two tasks is that the two types of tables need different partition frequencies.

Is it recommended to have more than one task writing to a Hive target? Would it have any impact on the control tables or anywhere else? If this is not the right approach, how can I achieve my requirements?


Accepted Solutions
Madhavi_Konda
Support

Hi,
You can certainly do that with two tasks, and there will be no conflict in the control tables.
All the Replicate control tables include task_name, table_owner, and table_name fields, so the metadata written by each task is kept separate and will not conflict.
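To make the keying concrete, here is a minimal sketch of how you could inspect one task's rows in a shared control table. It only builds the Hive query strings; the control-table name (attrep_apply_exceptions) follows Replicate's defaults, while the schema and task names are hypothetical placeholders — verify the actual names against your own target schema.

```python
# Illustrates why two tasks writing to the same Hive schema do not
# collide in the shared control tables: every row carries TASK_NAME,
# so each task's metadata can be isolated with a simple filter.

CONTROL_TABLE = "attrep_apply_exceptions"  # Replicate default name; confirm in your schema

def exceptions_query(schema: str, task_name: str) -> str:
    """Build a Hive query isolating one task's rows in a shared control table."""
    return (
        f"SELECT TABLE_OWNER, TABLE_NAME, ERROR_TIME, STATEMENT "
        f"FROM {schema}.{CONTROL_TABLE} "
        f"WHERE TASK_NAME = '{task_name}'"
    )

# Two tasks sharing one Hive schema remain separable by TASK_NAME
# (task names here are placeholders):
for task in ("daily_partition_task", "hourly_partition_task"):
    print(exceptions_query("my_hive_schema", task))
```

Run the resulting strings against your Hive target (via beeline or any Hive client) to check each task independently.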

Thanks,
Madhavi
