We are facing a critical issue with a table that Stitch replicates from PostgreSQL. The primary key column, 'ID', is currently an int and is approaching the upper limit of the 32-bit integer range (2,147,483,647). The dataset comprises several billion records, and we need to change this column's data type to bigint to accommodate further growth.

However, changing the data type directly in PostgreSQL triggers a mandatory reload of the entire dataset. Given the data volume, this reload would take approximately 10-15 days, during which we would lose significant reporting capability, a scenario we are keen to avoid.

Could you advise whether this data type change can be handled directly from the Stitch backend so that a full reload is not required? Additionally, are there any strategies or tools available through Stitch that could expedite this process without impacting our data availability for reporting?
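For context, the direct change we are trying to avoid is a plain column type alteration on the source table. A minimal sketch follows; the table name public.orders is illustrative, not our real schema:

```sql
-- Illustrative only: the straightforward type change on the PostgreSQL source.
-- Widening int to bigint is not a binary-compatible change, so PostgreSQL
-- rewrites the table, and the resulting schema change is what prompts
-- the full re-replication of the table in Stitch that we want to avoid.
ALTER TABLE public.orders
    ALTER COLUMN "ID" TYPE bigint;
```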