I want to delete data from Google BigQuery by running a DELETE statement in a tBigQueryInput component. I get an error, but the data in Google BigQuery is deleted anyway. Can you explain why this error happens and how to overcome it? I am using Talend Open Studio for Data Integration 7.2.1.20190131_1157-M2.
Exception in component tBigQueryInput_2 (criteo_data_import)
com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
"code" : 400,
"errors" : [ {
"domain" : "global",
"message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.",
"reason" : "apiLimitExceeded"
} ],
"message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.",
"status" : "INVALID_ARGUMENT"
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:145)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:312)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1049)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:410)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:343)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:460)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tBigQueryInput_2Process(criteo_data_import.java:12122)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_6Process(criteo_data_import.java:11680)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_5Process(criteo_data_import.java:10045)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_4Process(criteo_data_import.java:8410)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tBigQueryInput_1Process(criteo_data_import.java:5893)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_3Process(criteo_data_import.java:5352)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_2Process(criteo_data_import.java:3939)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_1Process(criteo_data_import.java:2526)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tJava_1Process(criteo_data_import.java:1145)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.runJobInTOS(criteo_data_import.java:28113)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.main(criteo_data_import.java:27942)
I am glad you figured it out. Please select a comment as a solution so this topic is marked as solved.
@Ronertas wrote:
So I have changed my GBQ partitioned table to wildcard tables and now everything is working.
I am not too familiar with this API, but it looks like the data in the response is exceeding the maximum size defined by the API. Is it possible the delete endpoint returns the row that is deleted, and the data in that row is quite large?
I think it is a bit more complicated. If I add a WHERE clause like 1 = 1 to delete the full table's data, there is no error, but if I select one day, I get this error. Also, the data is not big: the full table is less than 10 MB.
But that is information from 2016, and the error there is 403; in my case the error is 400.
Also, if I launch the same query in the GBQ UI, it finishes without errors.
Hi Ronertas,
Could you share a screenshot of your job, and the query you use, for better understanding? The error says "Unable to return a row that exceeds the API limits", but you say the table is less than 10 MB, so it should work. Let's find where the mistake is!
Thanks & Regards,
Prabuj
Query:
"#standardSQL \n\r delete FROM `sonorous-mix-211210.marketing_platforms_data.criteo_transactions` where cast(transactionDate as date) BETWEEN date_add(current_date('+03:00'), interval -2 day) and date_add(current_date('+03:00'), interval -2 day)"
It deletes about 2000 rows.
I have a similar query, but it deletes only 3 rows and does not show this error.
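A side note on the query above: both BETWEEN bounds are the same expression, date_add(current_date('+03:00'), interval -2 day), so the DELETE targets exactly one calendar day (two days ago in the UTC+3 offset). A minimal Java sketch of the equivalent date arithmetic, using only java.time, with the offset taken from the query:

```java
import java.time.LocalDate;
import java.time.ZoneOffset;

public class DeleteWindow {
    public static void main(String[] args) {
        // current_date('+03:00') in BigQuery: today's date in the UTC+3 offset
        LocalDate today = LocalDate.now(ZoneOffset.ofHours(3));
        // date_add(..., interval -2 day) is used for both BETWEEN bounds
        LocalDate start = today.minusDays(2);
        LocalDate end = today.minusDays(2);
        System.out.println("delete window: " + start + " .. " + end);
        // Identical bounds: the statement deletes a single day of data
        if (!start.equals(end)) throw new AssertionError("window should span exactly one day");
    }
}
```

Since both bounds are identical, `where cast(transactionDate as date) = date_add(current_date('+03:00'), interval -2 day)` would express the same filter more directly.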
Alright, so you use #standardSQL here. In the Talend tBigQueryInput designer section, check that the authentication mode and credentials match the query, and also make sure "Use legacy SQL" is unticked, then try one more time.
Let's see if it works out, or change the result size.
Thanks,
Prabuj.
If it still doesn't work out, you can definitely use tRESTClient for any REST API call, with an OAuth authorization token.
Thanks,
Prabuj
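Building on the tRESTClient suggestion: one option is to POST the DELETE to the BigQuery REST endpoint `jobs.query` (`https://bigquery.googleapis.com/bigquery/v2/projects/{projectId}/queries`) with an OAuth bearer token, since a DML statement submitted as a query job returns no result rows for the client to page through. A minimal sketch that only builds the request body; the project ID, endpoint wiring, and token are left out as assumptions and no call is made here:

```java
public class BigQueryDeleteRequest {
    // Build the jobs.query JSON request body for a DML statement.
    // useLegacySql must be false, because DELETE is Standard SQL only.
    static String buildBody(String sql) {
        // Escape backslashes and quotes so the SQL embeds safely in JSON
        String escaped = sql.replace("\\", "\\\\").replace("\"", "\\\"");
        return "{\"query\": \"" + escaped + "\", \"useLegacySql\": false}";
    }

    public static void main(String[] args) {
        String sql = "DELETE FROM `sonorous-mix-211210.marketing_platforms_data.criteo_transactions` "
                + "WHERE CAST(transactionDate AS DATE) = DATE_ADD(CURRENT_DATE('+03:00'), INTERVAL -2 DAY)";
        String body = buildBody(sql);
        System.out.println(body);
        if (!body.contains("\"useLegacySql\": false"))
            throw new AssertionError("standard SQL must be requested explicitly");
    }
}
```

The body would then be sent with an `Authorization: Bearer <token>` header from tRESTClient or a tJava component.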
I am using authentication mode OAuth 2.0. The credentials are correct. And yes, "Use legacy SQL" is unchecked.
I tried removing #standardSQL, but I get the same error.