Hi Gurus,
I am trying to go from one table in the source to many tables in the target database. The source is SQL Server and the target is Oracle. What are some patterns to achieve this? The table schemas differ between source and target.
Thanks!
Hi Dumma,
It sounds like a log stream task will allow you to have a single task reading the source table and then many replication tasks that would move the data to the target.
Please have a look at this article as it will get you going in the right direction.
Log Stream tasks
https://community.qlik.com/t5/Knowledge/LogStream-Setup/ta-p/1743657
Thanks,
Michael
Hi @Dumma
Your best bet would probably be to use Log Stream. One task would replicate the table to binary files on disk, then you would set up a task for each table you need on the target that reads the binary data and writes to the target.
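To make the fan-out concrete, here is a conceptual sketch in plain Python (not Replicate internals): the staging task writes each change once, and every replication task consumes the same staged stream independently at its own offset, so the source table is only read one time. The task and table names are made up.

```python
from collections import namedtuple

Change = namedtuple("Change", "table op data")

class LogStream:
    """Stands in for the binary staging files on disk."""
    def __init__(self):
        self._log = []

    def append(self, change):        # written once by the staging task
        self._log.append(change)

    def read_from(self, offset):     # each replication task tracks its own offset
        return self._log[offset:], len(self._log)

# Staging task: reads the source transaction log once, writes to the stream.
stream = LogStream()
stream.append(Change("ORDERS", "INSERT", {"id": 1, "status": "new"}))
stream.append(Change("ORDERS", "UPDATE", {"id": 1, "status": "shipped"}))

# Two independent replication tasks consuming the same stream for two target tables.
offsets = {"task_orders_a": 0, "task_orders_b": 0}
for task in offsets:
    changes, offsets[task] = stream.read_from(offsets[task])
    for c in changes:
        print(f"{task}: apply {c.op} to its own Oracle table with data {c.data}")
```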
Here's a link to our documentation on that feature:
Thanks,
Dana
Thanks @Dana_Baldwin @Michael_Litz
I will look at Log Stream to do this. Can the target tables have different primary keys from the source tables' primary keys?
Hi @Dumma ,
Yes, you can select the required primary key columns in the table transformation.
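As a rough illustration of why that selection matters (plain Python with a hypothetical helper, not anything Replicate actually exposes): whichever columns are marked as the key on the target side are what the apply logic matches on when it generates UPDATEs against the target.

```python
# Hypothetical helper, for illustration only: the key columns chosen in the
# table transformation drive the WHERE clause of the generated UPDATE.
def build_update(table, key_cols, changed_cols):
    set_clause = ", ".join(f"{c} = :{c}" for c in changed_cols)
    where_clause = " AND ".join(f"{c} = :k_{c}" for c in key_cols)
    return f"UPDATE {table} SET {set_clause} WHERE {where_clause}"

# Source PK was ID, but ORDER_NO was selected as the key on the target:
print(build_update("ORDERS", ["ORDER_NO"], ["STATUS"]))
# UPDATE ORDERS SET STATUS = :STATUS WHERE ORDER_NO = :k_ORDER_NO
```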
Thanks,
Swathi
Hi Dumma,
Please mark one of the posts as "Accept as Solution" if you found it helpful.
Thanks,
Michael
Hi All,
I don't have anything new to add here; I am just summarizing the details so the steps are clear and follow best practice.
1) Create a Log Stream staging task whose source is SQL Server and whose target is the Log Stream endpoint. This ensures the source table is read over a single connection, avoiding a bottleneck on the source.
2) Create an Oracle endpoint and choose a user/schema that has access to all the other target schemas. This avoids creating and maintaining multiple endpoints.
3) Build the replication tasks, each using the same Log Stream as its source and the Oracle endpoint as its target. For each task, either create a global transformation to choose the right schema or, if the task carries only one table, choose the schema in the table settings.
4) Start the Log Stream parent task and then the replication tasks in sequence (see the sketch below), and verify that all tasks run in parallel as expected.
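If you want to script step 4, an outline like the following works. The `start_task` and `task_state` helpers here are hypothetical placeholders for whatever you use to drive Replicate in your environment (for example the Qlik Enterprise Manager REST API); the names, signatures, and task names are illustrative assumptions, not a real client library.

```python
import time

# Hypothetical stand-ins: a real version would call your Replicate control
# interface. These just simulate state so the sketch runs end to end.
_states = {}

def start_task(name):
    print(f"starting {name}")
    _states[name] = "RUNNING"   # a real start call would be asynchronous

def task_state(name):
    return _states.get(name, "STOPPED")

def start_in_sequence(parent, children, timeout_s=300, poll_s=5):
    """Start the Log Stream parent task first, wait until it reports
    RUNNING, then start each child replication task (step 4 above)."""
    start_task(parent)
    deadline = time.time() + timeout_s
    while task_state(parent) != "RUNNING":
        if time.time() > deadline:
            raise RuntimeError(f"{parent} did not reach RUNNING in time")
        time.sleep(poll_s)
    for child in children:
        start_task(child)

start_in_sequence("logstream_staging", ["rep_orders", "rep_customers"])
```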
Thank you,
We tried different PKs on the target; however, we noticed that we got two inserts, and then the update went to the second insert when it should have gone to the first insert.
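That symptom is consistent with the new target key no longer uniquely identifying the source rows: the UPDATE's WHERE clause is built from the key defined on the target, so if two rows share that key value, the update can land on the unintended row. A minimal sketch in plain Python (not Replicate internals; table and column names are made up):

```python
rows = []  # simulated Oracle target table

def insert(row):
    rows.append(dict(row))

def apply_update(key_cols, key_vals, changes):
    """Match on the target-defined key columns, as the generated UPDATE would."""
    for row in rows:
        if all(row[c] == v for c, v in zip(key_cols, key_vals)):
            row.update(changes)    # every row matching the key is touched

# Source PK is ID; the target PK was changed to ACCOUNT_NO.
insert({"ID": 1, "ACCOUNT_NO": "A100", "STATUS": "new"})
insert({"ID": 2, "ACCOUNT_NO": "A100", "STATUS": "new"})  # same target key!

# The source change was an UPDATE for ID=1, but the WHERE clause carries only
# the target key, so the update is no longer routed to the intended row.
apply_update(["ACCOUNT_NO"], ["A100"], {"STATUS": "closed"})
print(rows)
```

In short, whatever columns you pick as the target key still need to uniquely identify each source row, or updates can be misapplied.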