suvbin
Creator II

Table got suspended

Hi,

I am getting the error below. What could be the issue, and how can it be resolved? This task was running fine before, but it is failing today.

Source: IBM DB2 for iSeries

Target: Google Cloud BigQuery

Full load task

Error: 

 

Waiting on bqjob_r1a9e083a5a097f09_0000018a257cee21_1 ... (0s) Current status: DONE
BigQuery error in load operation: Error processing job 'pa-qlk-production-
dfdfdfd:bqjob_r1a9e083dfdfd00018a257cee21_1': Error while reading data,
error message: Input CSV files are not splittable and at least one of the files
is larger than the maximum allowed size. Size is: 13608161648. Max allowed size
is: 4294967296.
Failure details:
- You are loading data without specifying data format, data will be
treated as CSV format by default. If this is not what you mean,
please specify data format by --source_format. [1020403] (csv_target.c:1012)
00009784: 2023-08-23T23:03:45 [TARGET_LOAD ]E: Failed to wait for previous run [1020403] (csv_target.c:1902)
00009784: 2023-08-23T23:03:46 [TARGET_LOAD ]E: Failed to load data from csv file. [1020403] (odbc_endpoint_imp.c:7776)
00009784: 2023-08-23T23:03:46 [TARGET_LOAD ]E: Handling End of table 'REPDATA'.'QWTYU' loading failed by subtask 2 thread 1 [1020403] (endpointshell.c:3050)
00008860: 2023-08-23T23:03:46 [TASK_MANAGER ]W: Table 'REPDATA'.'QWTYU' (subtask 2 thread 1) is suspended. Command failed to load data with exit error code 1, Command output:
Uploaded 0%...
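The two sizes in the message line up with BigQuery's documented per-file cap for non-splittable (e.g. compressed) CSV input, which is 4 GiB. A quick arithmetic check, using only the numbers copied from the error above:

```python
# Sizes reported in the error message above.
file_size = 13_608_161_648        # bytes in the generated CSV file
max_allowed = 4 * 1024 ** 3       # 4 GiB = 4,294,967,296 bytes

print(max_allowed)                # 4294967296, matching "Max allowed size"
print(file_size > max_allowed)    # True, so the load job is rejected
print(round(file_size / 1024 ** 3, 1))  # the file is about 12.7 GiB
```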

18 Replies
SachinB
Support

Hello @suvbin ,

 

From the provided information, it looks like the CSV file is failing while loading to the target endpoint. The error message states that the maximum allowed file size is 4,294,967,296 bytes (about 4 GB), and one of your CSV files is larger than this limit (13,608,161,648 bytes, about 12.7 GB).

You can try adjusting the max file size on the target endpoint as needed, then reload the table.

 

Regards,

Sachin B

 

suvbin
Creator II
Author

Thank you, Sachin, for the quick response. I will implement it.

suvbin
Creator II
Author

The issue persists. I have attached the logs.

suvbin
Creator II
Author

Attaching the logs for your reference.

SushilKumar
Support

Hello Team,

We advise engaging a TSE via a support case, since log analysis is needed here and, as we always recommend, the logs may hold critical information.

Regards,

Sushil Kumar

narendersarva
Support

Hi @suvbin 

Is this a brand-new task? Did it ever run successfully? If not, the appropriate max file size changes depending on the number of tables in your task and on whether any LOB/CLOB columns are present.

But yes, as Sushil mentioned, please create a case and our support team can help you with this.

 

Thanks
Naren

DesmondWOO
Support

Hi @suvbin ,

Could you provide
- the value that has been set for the "Max file size(MB):" in the BigQuery endpoint
- The size of e:\\data\\tasks\\WorkdataTar\\data_files\\2\\LOAD00000001.csv.gz

According to the Google Cloud website, the size limit for a compressed CSV file is 4 GB.

Regards,
Desmond

 

 

suvbin
Creator II
Author

Yes, Narender, the task ran fine before.

The max file size was set to 1000 GB.

I later included a filtering condition to limit the records from 900 million down to the 23 million required, and the table then loaded successfully.

But why did the issue occur? Is there a limit on data loads into BigQuery?

 

suvbin
Creator II
Creator II
Author

Hi Desmond,

The value set for "Max file size(MB):" in the BigQuery endpoint is 1000 GB (the maximum the endpoint allows).

-- According to the Google Cloud website, the size limit for a compressed CSV file is 4 GB. -- Thank you for this information.

So what would be the optimal approach here? Is it splitting the files at 4 GB, i.e., setting the endpoint's "Max file size(MB):" to 4 GB?

Can you please suggest?
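Assuming the endpoint splits its output at the configured cap, setting "Max file size(MB):" at or below 4,096 MB would turn the oversized load from the original error into a handful of smaller files, each within BigQuery's limit. Rough arithmetic:

```python
import math

total_bytes = 13_608_161_648   # size from the original error message
cap = 4 * 1024 ** 3            # 4 GiB per-file limit

# If the output is split at the cap, this many files would be produced:
print(math.ceil(total_bytes / cap))   # 4
```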