wangbinlxx
Creator

Redirect log file for tRedshiftConnection to standard output.

Hi

I upgraded Talend Jobs from 6.3.1 to 7.1.1 and have an issue with tRedshiftConnection.

 

In 6.3.1:

[Screenshot: tRedshiftConnection settings in 6.3.1 (0683p000009M2U1.jpg)]

After importing it into 7.1.1, the component automatically adds a "Log file" setting, and I cannot remove it.

[Screenshot: tRedshiftConnection settings in 7.1.1 (0683p000009M2U6.jpg)]

The job fails when I run it on a Remote Engine, since that engine runs on Linux:

### Exception in component tRedshiftConnection_1
java.io.FileNotFoundException: C:/var/local/talend/source_projects/TC_V711/redshift-jdbc.log (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)

 

I could add a new context variable and set it differently for dev and QA/Prod. However, that means modifying all jobs, which is not ideal.
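For what it's worth, a per-environment context variable does sidestep the hard-coded Windows path. A minimal sketch of the idea (the variable name `context.redshift_log_path` is hypothetical, not a Talend built-in): default the log path off `java.io.tmpdir`, which resolves to a writable directory on both Windows Studio machines and Linux Remote Engines.

```java
public class RedshiftLogPath {
    // Sketch only: mimics what a per-environment Talend context variable
    // (e.g. a hypothetical context.redshift_log_path) could default to.
    static String defaultLogPath() {
        // java.io.tmpdir is /tmp on Linux and %TEMP% on Windows, so the
        // same value works in Studio and on a Remote Engine.
        String tmp = System.getProperty("java.io.tmpdir").replace('\\', '/');
        if (!tmp.endsWith("/")) tmp += "/";
        return tmp + "redshift-jdbc.log";
    }

    public static void main(String[] args) {
        System.out.println(defaultLogPath());
    }
}
```

In a real Job this value would live in a context group so dev and QA/Prod load different values without touching each Job individually.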

 

My questions:

1. The doc link https://help.talend.com/reader/1QU27dRAgeVTpWb4Bt3FMg/t2ya9FBdk8lMNyD9~6VClw does not mention a "Log file" option. Did I install the wrong version?

 

2. Can I redirect this log to standard output?

 

Thanks,

Bin

 

 

4 Replies
Anonymous
Not applicable

Hello,

 

Did anyone resolve the above issue?

We have migrated and published jobs developed in Talend Studio 6.x into Talend Cloud 7.1.1.

The jobs work just fine locally. However, they give the below error when run in the cloud:

 

 

Failed   2019-05-30 17:35:13

 

tDBConnection_1 C:/Users/narasimha.mlv/Downloads/Talend-Studio-20181026_1147-V7.1.1/workspace/redshift-jdbc.log (No such file or directory)
java.io.FileNotFoundException: C:/Users/narasimha.mlv/Downloads/Talend-Studio-20181026_1147-V7.1.1/workspace/redshift-jdbc.log (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(Unknown Source)
at java.io.FileOutputStream.<init>(Unknown Source)
at java.io.FileOutputStream.<init>(Unknown Source)
at java.io.FileWriter.<init>(Unknown Source)

 

Just to isolate the issue, we changed both the source and target to Oracle, and it worked with no log file issue. Whereas with Redshift, it gives the above error.

 

Let us know if you have come across this issue and fixed it.

 

Thanks

lgati
Contributor II

I'm using Talend Cloud Big Data Platform and we have the same behavior with the tRedshift component.

What we have done is use a context variable for the Redshift log path: on our local computers it points to the Talend workspace, and in other environments we set it to /tmp/talend/redshift/logs/redshift-{jobname}.log
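One detail worth adding to this workaround: `java.io.FileNotFoundException` is thrown not only for an invalid drive-letter path on Linux, but also when the parent directory of the log file does not yet exist. So if the context points at /tmp/talend/redshift/logs/… , that directory tree should be created first, e.g. in a small pre-step before the connection component. A sketch under that assumption (the helper name is ours, not a Talend API):

```java
import java.io.File;

public class EnsureLogDir {
    // The driver fails with FileNotFoundException when the log file's
    // parent directory is missing, so create the directory tree up front.
    static boolean ensureParentDirs(String logPath) {
        File parent = new File(logPath).getParentFile();
        // mkdirs() returns false for an existing directory, hence the
        // isDirectory() check first.
        return parent != null && (parent.isDirectory() || parent.mkdirs());
    }

    public static void main(String[] args) {
        String path = System.getProperty("java.io.tmpdir")
                + "/talend/redshift/logs/redshift-demo.log";
        System.out.println(ensureParentDirs(path));
    }
}
```

In a Talend Job, the body of `ensureParentDirs` could live in a tJava component that runs before the Redshift connection, with the path taken from the same context variable.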


[Screenshot: Screen Shot 2019-05-31 at 2.44.26 PM.png]
Anonymous
Not applicable

Hi,

I tried it that way too, but I am still facing the issue.

Is there any other approach to get the log file?

 

Thanks,

Alekhya.

lgati
Contributor II

Did you see the screenshot I attached?

We use a Redshift connection component and drive the other components off of it. If you have more than one Redshift component, make sure all of them use the correct context. This should solve the "file/path could not be found" problem in your other environments.