dgreenx
Creator

Issues with Snowflake components I have noticed as of the 2022-02 update to TStudio

Here are some things I have noticed with Snowflake components that are odd and require work-around coding:

1) tSnowflakeRow or tDBRow(Snowflake): When you use it, it shows a warning icon saying that you should provide an output (screenshot attached). You can have it Guess Schema for the output and it will sometimes create one. That's nice, but you will NOT get any output, which is a shame, because you might like to capture the number of records you just deleted, for example (one idea for that is sketched below). The other irritation is that the warning is there in the first place when it does not need to be, since you don't always need to capture the output (which is nothing anyway), and the warning icon is in the way if you provide Documentation and would like the little blue icon to appear when you use the Show Information selection.

(Second screenshot attached.) Maybe there is a reason for this behavior, but I don't know what it is.
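
If you really do need the deleted count, one idea (only a rough sketch on my part, with made-up credentials, account URL, and table name) is to run the statement yourself over JDBC from a tJava step, since executeUpdate returns the number of affected rows:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.Properties;

public class SnowflakeDeleteCountSketch {
    public static void main(String[] args) throws Exception {
        // All connection values below are placeholders; adjust to your account.
        Properties props = new Properties();
        props.put("user", "MY_USER");
        props.put("password", "MY_PASSWORD");
        props.put("warehouse", "MY_WH");
        props.put("db", "MY_DB");
        props.put("schema", "PUBLIC");

        String url = "jdbc:snowflake://myaccount.snowflakecomputing.com/";

        try (Connection con = DriverManager.getConnection(url, props);
             Statement stmt = con.createStatement()) {
            // executeUpdate returns the number of rows the DELETE touched,
            // which is exactly the count the component never hands back.
            int deleted = stmt.executeUpdate(
                "DELETE FROM MY_TABLE WHERE LOAD_DATE < CURRENT_DATE - 30");
            System.out.println("Rows deleted: " + deleted);
        }
    }
}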

2) tSnowflakeInput or tDBInput(Snowflake): When you drag the Row Main (or select Row Main) to the tSnowflakeOutput or tDBOutput(Snowflake), it will not prompt you, as other t___Input connectors do, to use the schema of the output connector. The work-around is to open the output connector and copy its schema to the input connector. Four or five clicks, but wholly unnecessary if it prompted like the other input connectors. This works 99.9 percent of the time, but I have one job where I used the output schema as the input schema and it failed on a datetime field during the Snowflake read. So I did a Guess Schema on the input and put a tMap between them; all the fields are basically identical, so I automapped it and it worked (an explicit cast in the input query might be another way around it; see the sketch below). Still scratching my head over that one. Maybe there is a reason for this no-prompting behavior, but I don't know what it is.
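
For the datetime failure, something I have not actually verified (the table and column names here are invented) would be to convert the timestamp explicitly in the input query so there is no ambiguity about the type Talend sees, along these lines:

public class SnowflakeInputQuerySketch {
    // Hypothetical value for the tDBInput(Snowflake) Query field; TO_VARCHAR
    // pins CREATED_AT down to a plain string, which a tMap can parse back to
    // a Date if a real datetime is needed downstream.
    static final String QUERY =
          "SELECT ID, "
        + "       TO_VARCHAR(CREATED_AT, 'YYYY-MM-DD HH24:MI:SS.FF3') AS CREATED_AT "
        + "FROM MY_SCHEMA.MY_TABLE";
}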

3) tSnowflakeOutput or tDBOutput(Snowflake): This one is a bit bizarre. If you have fields that were quoted in the Snowflake table creation (for example, you need or prefer spaces in field names, so you wrap the field names in double quotes when creating the table), the output process of the connector goes haywire and starts dropping columns (you can see this on the Snowflake side in the queries issued by your Talend user id for the connection). Then the Talend job starts rapidly consuming memory until it hits the ceiling and bombs. The work-around I have found is to just forget about using quoted field names in tables (or write the rows yourself over JDBC, sketched below). Maybe there is a reason for this behavior, but I don't know what it is.
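
If quoted field names are a hard requirement, one escape hatch I have only sketched (table and column names are invented, and the connection would come from a tDBConnection or the JDBC sketch above) is to write the rows yourself with a PreparedStatement, since plain JDBC handles quoted, space-containing identifiers without issue:

import java.sql.Connection;
import java.sql.Date;
import java.sql.PreparedStatement;

public class QuotedNamesInsertSketch {
    // Quoted identifiers in Snowflake are case-sensitive and may contain
    // spaces, so the SQL must quote them exactly as they were created.
    static void insertRow(Connection con) throws Exception {
        String sql = "INSERT INTO \"My Table\" (\"Customer Name\", \"Order Date\") "
                   + "VALUES (?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, "ACME Corp");
            ps.setDate(2, Date.valueOf("2022-02-01"));
            ps.executeUpdate();
        }
    }
}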

Please feel free to add your experiences with Snowflake connectors, as well.

Thanks,

dg

2 Replies
dgreenx
Creator
Author

One other thing I forgot:

In tMap, when you change the schema of the inbound flow inside tMap, it does not change the schema in the tSnowflakeInput or tDBInput(Snowflake) connector feeding it. I reported this tMap behavior for other inbound connectors back in version 7.x and thought it was going to be fixed, but it looks like it has not been. The work-around is to change the schema in the t____Input connector itself rather than only in the tMap component.

 

Thanks,

dg

Anonymous
Not applicable

@david green, thanks for your great feedback. I have tested the issues (#1, #2, #4) you mentioned and I can reproduce them. Could you please create an issue under the Component project (COMP) on the Talend Bugtracker so our R&D team can fix them?

 

Regards

Shong