Pranita123
		Hello,
We are encountering the following error while loading data from Oracle (source) to HDFS (target):
"Handling the end of table 'db.tablename' loading failed by subtask 1, thread 1.
Failed to handle a special table.
Failed to execute the 'insert into select' command. Return Code: SQL_ERROR, SqlState: HYT00, NativeError: 72. Message: [Cloudera][Hardy] (72) Query execution timeout expired. Failed (retcode -1) to execute the statement: INSERT INTO TABLE db.tablename SELECT field1, field2 FROM db.tablename_att_tmp"
Task details: Full load + Store changes (target file format: Parquet).
Could you please provide some insight into the reasons behind this issue?
Regards,
Pranita
aarun_arasu
Hello @Pranita123,
Thank you for reaching out to the Qlik Community.
It appears that you are encountering a timeout while loading data from Oracle to HDFS with the target file format set to Parquet. To address this, please follow these steps:
1. Stop the task.
2. In the target endpoint settings, adjust the following internal parameters:
   - cdcTimeout (default 600): set to 12000 to allow a longer timeout period.
   - executeTimeout (default 60): set to 1200 to extend the timeout duration.
   - loadTimeout (default 1200): set to 24000 to accommodate larger data transfers.
3. Resume the task.
Below is the reference article:
https://community.qlik.com/t5/Official-Support-Articles/Qlik-Replicate-Query-timeout-expired/ta-p/17...
Regards
Arun
aarun_arasu
Hello @Pranita123,
Additionally, please check the resource availability on your target database to make sure it has enough resources to execute the query. I also suggest manually executing the query on the target database and observing how long it takes to complete; this can reveal performance issues specific to the target environment.
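For example, the statement from the error message can be run by hand (e.g. in beeline or the Hive CLI) and timed - a minimal sketch, reusing the table and column names from the error above:

-- Sketch: see how much data is waiting in the temporary table, then time
-- the merge statement itself when it is run outside of Replicate.
SELECT COUNT(*) FROM db.tablename_att_tmp;
-- The exact statement Replicate reported as timing out:
INSERT INTO TABLE db.tablename
SELECT field1, field2 FROM db.tablename_att_tmp;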
Regards
Arun
aarun_arasu
		If our response has been helpful, please consider clicking "Accept as Solution". This will assist other users in easily finding the answer.
Regards
Arun
Pranita123
		Thank you for your prompt response, @aarun_arasu
Actually, we are unable to locate the cdcTimeout and loadTimeout internal parameters. As I mentioned before, we are using HDFS as the target.
Regards,
Pranita
john_wang
Hello @Pranita123,
You are right - the cdcTimeout and loadTimeout internal parameters are not available for the Hadoop target endpoint. Only executeTimeout is available, and it should help.
BTW, please also check resource usage on the Hadoop side. The timeout occurs while the temporary table data is merged into the final target table via the 'insert into select' command; this operation runs inside the Hadoop cluster, so a timeout can occur when resources such as CPU and memory are lacking.
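As a quick check on the Hadoop side, the execution plan of that merge statement can be inspected before running it - a small sketch, reusing the statement from the error message (an unexpectedly heavy plan points at cluster resources rather than Replicate settings):

-- Sketch: ask Hive for the plan of the merge that is timing out.
EXPLAIN
INSERT INTO TABLE db.tablename
SELECT field1, field2 FROM db.tablename_att_tmp;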
Regards,
John.
Pranita123
@john_wang Thank you for your response.
We want to clarify whether this temporary table is generated only when we use the Parquet format in HDFS, or whether it is created for all file types?
john_wang
Hello @Pranita123,
Not for all file types - for example, it is not used for CSV tables. It is used for the file format conversion stage; however, I cannot remember clearly whether it is used for Parquet or for compression, etc. If you need, I can confirm for you tomorrow.
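To illustrate the conversion stage described above - a hypothetical sketch only, where the column types and table definitions are assumptions rather than what Replicate actually creates - the incoming rows land in a temporary staging table, and the 'insert into select' rewrites them into the final Parquet table:

-- Hypothetical illustration of the two-step load (assumed definitions):
CREATE TABLE db.tablename_att_tmp (field1 STRING, field2 STRING);   -- temporary staging table
CREATE TABLE db.tablename (field1 STRING, field2 STRING)
  STORED AS PARQUET;                                                 -- final Parquet target table
-- The conversion/merge step reported in the error message:
INSERT INTO TABLE db.tablename
SELECT field1, field2 FROM db.tablename_att_tmp;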
Feel free to let me know if you need any additional information.
Regards,
John.
