I have been sending my dashboard data to OpenAI's GPT-3.5-turbo-16k model, which accepts only 16,384 tokens per request. It works well on a limited number of rows (around 120 to 150), but the actual dataset is much larger, around 10 to 15 lakh (1 to 1.5 million) rows, so the model's token limit is exceeded.
Is there any other way to ingest large data into a GPT model?
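For context, one workaround I have seen suggested is to split the data into batches that each fit under the context limit and send them as separate requests (or summarize each batch and work with the summaries). Below is a minimal sketch of that batching idea. The helper `chunk_rows` and the rough 4-characters-per-token estimate are my own assumptions, not an OpenAI API; an exact count would need a tokenizer such as tiktoken.

```python
# Hypothetical sketch: group serialized rows into batches that each
# stay under a token budget, so each batch can be sent as one request.
# Assumes ~4 characters per token as a crude estimate (not exact).

def chunk_rows(rows, max_tokens=16384, chars_per_token=4):
    """Group stringified rows into batches within the token budget."""
    budget = max_tokens * chars_per_token  # approximate character budget
    batches, current, size = [], [], 0
    for row in rows:
        text = str(row)
        # start a new batch when adding this row would exceed the budget
        if current and size + len(text) > budget:
            batches.append(current)
            current, size = [], 0
        current.append(text)
        size += len(text)
    if current:
        batches.append(current)
    return batches

# stand-in for the dashboard data (real data would come from a CSV/DB)
rows = [f"row {i}: value={i * 10}" for i in range(100_000)]
batches = chunk_rows(rows)
print(f"{len(batches)} requests needed instead of one")
```

Each batch would then be sent in its own API call; for question-answering over the whole dataset, a retrieval approach (embedding the rows and sending only the relevant ones per query) usually scales better than pushing every row through the model.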