AbdullahMastan
Partner - Contributor III

Stitching a Spark Component to Its Source and Destination

Greetings,

Problem: Stitching a Spark component containing Python scripts to its source and destination (PostgreSQL).

What I have tried:

  1. Created a Spark component and imported its scripts, but I am unable to stitch it to its source and destination.

  2. Created a Data Mapper and manually defined sources, destinations, and transformations. However, this approach is impractical for companies relying on large ELT scripts.

Given the massive Python codebase we have, is there a way to automatically discover the transformations that a dataset's fields have undergone within the script?

OR

What would be a more effective approach to handle this scenario?
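To illustrate what I mean by "automatically discover transformations": something along these lines, where a tool statically scans the script and reports derived columns and their source columns. This is only a rough sketch I put together with Python's standard `ast` module against a hypothetical PySpark snippet, not a Talend Data Catalog feature:

```python
import ast

# Hypothetical PySpark-style snippet standing in for our real ELT scripts.
PYSPARK_SNIPPET = '''
df = df.withColumn("full_name", concat(col("first_name"), col("last_name")))
df = df.withColumn("age_plus_one", col("age") + 1)
'''

def discover_column_lineage(source: str) -> dict:
    """Map each column created via .withColumn(...) to the col() references
    found inside its expression (a crude, static field-level lineage)."""
    lineage = {}
    tree = ast.parse(source)
    for node in ast.walk(tree):
        # Match calls of the form <something>.withColumn("name", <expr>)
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and node.func.attr == "withColumn"
                and node.args
                and isinstance(node.args[0], ast.Constant)):
            target = node.args[0].value
            sources = set()
            # Collect every col("...") reference inside the expression
            for inner in ast.walk(node.args[1]):
                if (isinstance(inner, ast.Call)
                        and isinstance(inner.func, ast.Name)
                        and inner.func.id == "col"
                        and inner.args
                        and isinstance(inner.args[0], ast.Constant)):
                    sources.add(inner.args[0].value)
            lineage[target] = sorted(sources)
    return lineage

print(discover_column_lineage(PYSPARK_SNIPPET))
# {'full_name': ['first_name', 'last_name'], 'age_plus_one': ['age']}
```

Of course this only catches simple `withColumn`/`col` patterns; real scripts use UDFs, SQL strings, and dynamic column names, which is exactly why I'm asking whether the catalog can do this discovery for us.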

#Talend Data Catalog

Labels (1)
  • mdm

2 Replies
AbdullahMastan
Partner - Contributor III
Author

Hey @Shicong_Hong, any comments, or do you know who could help?

AbdullahMastan
Partner - Contributor III
Author

The replies from the community on this post are not visible on any of my devices.

What's more, my reply to someone else's reply has also disappeared.

 

What's going on?