Starting job consumer at 22:56 19/03/2020.
[statistics] connecting to socket on port 3845
[statistics] connected
Job Started.
Job running with context = 'development'
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
    at com.google.api.gax.retrying.BasicRetryingFuture.<init>(BasicRetryingFuture.java:77)
    at com.google.api.gax.retrying.DirectRetryingExecutor.createFuture(DirectRetryingExecutor.java:73)
    at com.google.cloud.RetryHelper.run(RetryHelper.java:73)
    at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:51)
    at com.google.cloud.bigquery.BigQueryImpl.create(BigQueryImpl.java:212)
    at com.google.cloud.bigquery.BigQueryImpl.create(BigQueryImpl.java:187)
    at project.job_version.job.tStatCatcher_1Process(job.java:12389)
    at project.job_version.job.tGSPut_1Process(job.java:10671)
    at project.job_version.job.tKafkaInput_1Process(job.java:10537)
    at project.job_version.job.tKafkaConnection_1Process(job.java:5626)
    at project.job_version.job.tJava_1Process(job.java:5502)
    at project.job_version.job.tPrejob_1Process(job.java:5371)
    at project.job_version.job.runJobInTOS(job.java:12865)
    at project.job_version.job.main(job.java:12581)
[statistics] disconnected
[statistics] disconnected
[statistics] disconnected
[statistics] disconnected
[statistics] disconnected
[statistics] disconnected
Job job ended at 22:57 19/03/2020. [Exit code = 1]
I created a job that consumes messages from a Kafka topic and writes them out to a CSV file.
The job then uploads the CSV file to Google Cloud Storage.
Also, in the same job, I have the tAssert, tFlowMeter, tLogCatcher, and tStatsCatcher components mapped to BigQuery output components.
When I disable only the CSV upload to GCS, the job runs successfully.
When I disable only the log components, the job runs successfully.
When both are enabled, I receive the error listed above.
Has anyone run into this or a similar issue?
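For reference, this kind of NoSuchMethodError usually points to a Guava version conflict: MoreExecutors.directExecutor() only exists from Guava 18.0 onward, so if an older Guava bundled with one component's library set wins on the classpath, the BigQuery client fails exactly like this. A quick way to see which JAR the class is actually being loaded from is a probe like the one below (a minimal standalone sketch; the class name GuavaProbe is just for illustration, and the two lines in main can equally be dropped into a tJava component):

import com.google.common.util.concurrent.MoreExecutors;

// Prints which JAR supplies the Guava MoreExecutors class. If the path
// points at a pre-18.0 Guava, directExecutor() is missing and the job
// fails with the NoSuchMethodError shown above.
public class GuavaProbe {
    public static void main(String[] args) {
        System.out.println("MoreExecutors loaded from: "
                + MoreExecutors.class.getProtectionDomain()
                                     .getCodeSource()
                                     .getLocation());
    }
}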
Same here.
Having BigQuery alone is fine.
Having Drive alone is fine.
Having both does not work.
Thanks for the feedback. Hopefully there's an enhancement on the feature backlog that would allow multiple Google-related components to be added within a single job.
Do you think a potential workaround is to leave the BigQuery logging components in a parent job and create a sub-job that contains the Google Cloud Storage component?
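Splitting it that way could work, but I believe the key is to run the child job in its own JVM so the two Google client stacks never share a classpath; in Talend that would mean calling the sub-job via tRunJob with its "Use an independent process to run subjob" option checked. Conceptually it amounts to launching the exported sub-job as a separate process, roughly like this (a sketch of the idea only; the jar path and context argument are placeholders, not anything the parent job generates for you):

import java.io.IOException;

// Sketch: launch the exported GCS sub-job in a separate JVM so its Google
// Cloud Storage libraries never share a classpath with the parent job's
// BigQuery/Guava jars. The jar path below is a placeholder.
public class RunGcsSubJob {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "java", "-jar", "/opt/jobs/gcs_upload_job.jar",
                "--context=development");
        pb.inheritIO(); // forward the child's stdout/stderr to the parent log
        int exit = pb.start().waitFor();
        if (exit != 0) {
            throw new IllegalStateException("GCS sub-job failed with exit code " + exit);
        }
    }
}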