Hey guys,
I'm currently using dynamic schema settings in a tPostgresqlInput: I fetch the columns I want with a query, and in the schema I define a single dynamic column, just like in the picture below.
What's the problem? Writing the results into a tFileOutputDelimited works fine, but when I write them into a tPostgresqlOutput I get the following error: an exception saying the length for my varchar column cannot exceed 10485760. I don't know how to set the length of the output columns, since they have a dynamic schema.
Has anyone dealt with this kind of issue?
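For context, 10485760 is PostgreSQL's hard cap on the declared length of a varchar column. A minimal JDBC sketch that shows where the error comes from (the connection URL, credentials, and class name are placeholders, and it assumes the PostgreSQL JDBC driver is on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class VarcharLimitDemo {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             Statement st = conn.createStatement()) {
            // This works: 10485760 is the largest declared length PostgreSQL accepts for varchar.
            st.execute("CREATE TABLE demo_ok (c varchar(10485760))");
            // This fails with "length for type varchar cannot exceed 10485760".
            st.execute("CREATE TABLE demo_ko (c varchar(10485761))");
        }
    }
}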
Hi,
Could you please check the maximum length allowed for the varchar column in the target PostgreSQL table?
The error could be because the target table has an upper cut-off threshold on that column.
Warm Regards,
Nikhil Thampi
Hi,
Thanks for the response!
I can't check that, because I have checked the option "Erase table if exists and create", since this table is created from scratch every time my process runs. So I don't know anything about the columns or the maximum length allowed for them.
Thinking about it, I noticed that in an earlier part of this whole process I create a plain text file with those values. I think I have to work with that file, reading the headers and creating the table dynamically.
Hi,
Since you are loading the data into a varchar column, set a maximum length for that column in the tDBOutput component.
Before passing the data to the DB, use a tMap to measure the length of the data; if it exceeds the maximum allowed in the DB component, reject those rows and write them to a file so the support team can verify them later.
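One way to do that length check on a Dynamic column is a small piece of Java before the DB output. This is only a sketch: it assumes the Dynamic column is called data (as later in this thread) and a limit of 255 that must match whatever is configured in the tDBOutput:

// Sketch only: check whether any varchar value in the Dynamic column exceeds the allowed limit.
routines.system.Dynamic columns = row1.data;
int maxLength = 255; // assumed limit, keep in sync with the length set in the tDBOutput schema
boolean tooLong = false;

for (int i = 0; i < columns.getColumnCount(); i++) {
    Object value = columns.getColumnValue(i);
    if (value instanceof String && ((String) value).length() > maxLength) {
        tooLong = true; // this row would be rejected by the database
        break;
    }
}

// Rows with tooLong == true can then be routed to a reject file instead of the tDBOutput,
// for example by exposing the flag as an extra (hypothetical) boolean column in the output schema.
output_row.data = row1.data;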
Warm Regards,
Nikhil Thampi
Hi again,
Sorry for my delay.
That didn't work for me: I put the length limit into the tDBOutput component for the dynamic output and I still got the error I mentioned before.
Anyways, I started to work in a different way and I solved it.
I'll explain what I've done.
First, I built the query I wanted in a tDBInput component, and in the schema I added only one dynamic column, just like in the picture I showed in my first message. Then I loaded the data directly into a tFileOutputDelimited, which has no problem handling dynamic columns coming from a query.
I checked the file and it has exactly all the columns I needed. Then I built the following job:
Here the tFileInputFullRow reads the first line (the headers) of the plain text file that contains the dynamic results. Then, in the tJavaRow, I did the following:
I dynamically built the query that creates the table I want, setting the length of each column, then I saved that query into a context variable, and in the tDBRow I simply execute it, which creates the dynamic table.
Finally, I load the data from the dynamic query directly into the table, and there is no problem with the column lengths since I defined them beforehand.
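In case it helps someone else, here is a rough sketch of what that tJavaRow could look like. It is only a sketch: it assumes the header arrives in the default line column of tFileInputFullRow, that the file is semicolon-delimited, and that the table name and the 255 length are placeholders:

// Sketch only: build the CREATE TABLE statement from the header line of the delimited file.
String headerLine = row1.line;             // tFileInputFullRow delivers the whole line in a single column
String[] headers = headerLine.split(";");  // assumed delimiter, adjust to the file's field separator

StringBuilder sb = new StringBuilder("CREATE TABLE my_dynamic_table ("); // table name is a placeholder
for (int i = 0; i < headers.length; i++) {
    // assumed length of 255 per column; set whatever length each column really needs
    sb.append("\"").append(headers[i].trim()).append("\" varchar(255)");
    if (i < headers.length - 1) {
        sb.append(", ");
    }
}
sb.append(");");

context.query = sb.toString(); // the tDBRow then executes context.query to create the table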
Thanks for your help nikhilthampi!!
Regards!
Good work!
Could you please mark your reply as the solution for this post? It will help other Talend community members who run into the same issue.
Warm Regards,
Nikhil Thampi
To solve this problem, use a routine based on the Dynamic metadata, like the following:
// Build the column definitions from the Dynamic column coming in on row1.data.
routines.system.Dynamic dynamicColumnsTmp = row1.data;

String statement = "CREATE TABLE my_table ("; // table name is a placeholder, adjust as needed
String null_or_not = "";                      // set to "NOT NULL" here if the columns must not be nullable
String varchar_value = "";                    // holds "(255)" etc., or stays empty for an unbounded varchar

// Loop through the columns of the Dynamic schema
for (int i = 0; i < dynamicColumnsTmp.getColumnCount(); i++) {
    // Retrieve the value of the current column inside the Dynamic column (not needed for the DDL itself)
    Object obj = dynamicColumnsTmp.getColumnValue(i);
    // Retrieve the metadata of the current column inside the Dynamic column
    routines.system.DynamicMetadata columnMetadata = dynamicColumnsTmp.getColumnMetadata(i);

    if ("varchar".equalsIgnoreCase(columnMetadata.getDbType()) && columnMetadata.getLength() >= 10485760) {
        // When the reported length is >= 10485760, CREATE TABLE fails with the varchar length error,
        // so leave the length out and fall back to an unbounded varchar.
        varchar_value = "";
    } else if ("varchar".equalsIgnoreCase(columnMetadata.getDbType()) && columnMetadata.getLength() < 10485760) {
        varchar_value = "(" + columnMetadata.getLength() + ")";
    } else {
        varchar_value = "";
    }

    if (i < dynamicColumnsTmp.getColumnCount() - 1) {
        statement = statement + "\"" + columnMetadata.getName() + "\" " + columnMetadata.getDbType() + varchar_value + " " + null_or_not + ",";
    } else {
        // The last column does not get a trailing comma
        statement = statement + "\"" + columnMetadata.getName() + "\" " + columnMetadata.getDbType() + varchar_value + " " + null_or_not;
    }
}

statement = statement + ");";
context.query = statement;
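The tDBRow component that follows can then simply have context.query in its Query field, so the generated CREATE TABLE statement is executed right before the data load. Note that my_table above is only a placeholder for whatever table name your job actually targets.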