Anonymous
Not applicable

tBigQueryInput delete data

I want to delete data from Google BigQuery using a DELETE statement in a tBigQueryInput component. I get an error, but the data in BigQuery is actually deleted. Can you explain why this error happens and how to overcome it? I am using Talend Open Studio for Data Integration 7.2.1.20190131_1157-M2.

 

Exception in component tBigQueryInput_2 (criteo_data_import)
com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
"code" : 400,
"errors" : [ {
"domain" : "global",
"message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.",
"reason" : "apiLimitExceeded"
} ],
"message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.",
"status" : "INVALID_ARGUMENT"
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:145)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:312)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1049)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:410)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:343)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:460)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tBigQueryInput_2Process(criteo_data_import.java:12122)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_6Process(criteo_data_import.java:11680)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_5Process(criteo_data_import.java:10045)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_4Process(criteo_data_import.java:8410)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tBigQueryInput_1Process(criteo_data_import.java:5893)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_3Process(criteo_data_import.java:5352)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_2Process(criteo_data_import.java:3939)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_1Process(criteo_data_import.java:2526)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tJava_1Process(criteo_data_import.java:1145)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.runJobInTOS(criteo_data_import.java:28113)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.main(criteo_data_import.java:27942)


Accepted Solutions
nfz11
Creator III

I am glad you figured it out.  Please select a comment as a solution so this topic is marked as solved.


@Ronertas wrote:

So I have changed my GBQ partitioned table to wildcard tables, and now everything is working.


23 Replies
nfz11
Creator III

I am not too familiar with this API, but it looks like the data in the response exceeds the maximum size defined by the API. Is it possible the delete endpoint returns the row that is deleted, and the data in that row is quite large?

Anonymous
Not applicable
Author

I think it is a bit more complicated. If I add a WHERE clause like 1 = 1 to delete the full table's data, there is no error, but if I select one day, then I get this error. Also, the data is not big; the full table is less than 10 MB.

nfz11
Creator III

I did a Google search for the error message and found this:
https://stackoverflow.com/questions/37547711/bigquery-api-limit-exceeded-error

I don't think it's related to Talend, but rather a known issue with Google BigQuery.
Anonymous
Not applicable
Author

But that is information from 2016, and the error there is 403. In my case the error is 400.

Also, if I launch the same query in the GBQ UI, it finishes without errors.

Anonymous
Not applicable
Author

Hi Ronertas,

 

Could you share a screenshot of your job, and the query you use, for better understanding? The error says "Unable to return a row that exceeds the API limits", but you say the table is less than 10 MB. It should work! Let's find where the mistake is!

 

Thanks & Regards,

Prabuj

Anonymous
Not applicable
Author

Query:

"#standardSQL \n\r delete FROM `sonorous-mix-211210.marketing_platforms_data.criteo_transactions` where cast(transactionDate as date) BETWEEN date_add(current_date('+03:00'), interval -2 day) and date_add(current_date('+03:00'), interval -2 day)"
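As a side note, the BETWEEN clause above uses the same expression for both bounds, so it is effectively an equality test on "the date two days ago, in UTC+3". A minimal Python sketch of that cutoff-date computation, just to make the filter's logic explicit (the function name here is made up for illustration):

```python
from datetime import datetime, timedelta, timezone

def criteo_cutoff_date(now=None):
    """Mirror BigQuery's date_add(current_date('+03:00'), interval -2 day):
    take today's date in UTC+3, then step back two days."""
    tz = timezone(timedelta(hours=3))  # the '+03:00' offset from the query
    now = now or datetime.now(tz)
    return (now.astimezone(tz).date() - timedelta(days=2)).isoformat()

# With a fixed timestamp the result is deterministic:
fixed = datetime(2019, 9, 19, 1, 0, tzinfo=timezone.utc)  # 04:00 in UTC+3
print(criteo_cutoff_date(fixed))  # → 2019-09-17
```

Since both bounds are identical, the WHERE clause could equally be written as `cast(transactionDate as date) = date_add(current_date('+03:00'), interval -2 day)`.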

 

It deletes about 2,000 rows.

I have a similar query, but it deletes only 3 rows and does not show this error.


job.PNG
Anonymous
Not applicable
Author

Alright, so you use #standardSQL here. Check in the Talend tBigQueryInput component settings whether the authentication mode and credentials match the query, and also check that "Use legacy SQL" is unticked, then try one more time.

Let's see if it works out, or change the result size.

 

0683p000009M5rN.jpg

Thanks,
Prabuj.

Anonymous
Not applicable
Author

If it still doesn't work out, you can always use tRESTClient for any REST API call, with an OAuth authorization token.
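For reference, a minimal sketch (in Python rather than Talend, with a hypothetical helper name) of the JSON body such a tRESTClient call would POST to BigQuery's jobs.query endpoint, `POST https://bigquery.googleapis.com/bigquery/v2/projects/{projectId}/queries`; the "Authorization: Bearer <OAuth token>" header would be configured separately on the component:

```python
import json

def build_query_request(sql):
    """Build the request body for BigQuery's jobs.query REST method."""
    return json.dumps({
        "query": sql,
        "useLegacySql": False,  # equivalent to unticking "Use legacy SQL"
    })

# Table name taken from the query shown earlier in this thread:
body = build_query_request(
    "DELETE FROM `sonorous-mix-211210.marketing_platforms_data.criteo_transactions` "
    "WHERE cast(transactionDate as date) = "
    "date_add(current_date('+03:00'), interval -2 day)"
)
```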

 

Thanks,

Prabuj 

 


 

Anonymous
Not applicable
Author

I am using authentication mode OAuth 2.0. The credentials are correct. And yes, "Use legacy SQL" is unchecked.

I tried to remove #standardSQL, but I get the same error.