I want to delete data from Google BigQuery by running a DELETE statement in a tBigQueryInput component. The job throws an error, but the data is actually deleted in BigQuery. Can you explain why this error happens and how to overcome it? I am using Talend Open Studio for Data Integration 7.2.1.20190131_1157-M2.
Exception in component tBigQueryInput_2 (criteo_data_import)
com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
{
"code" : 400,
"errors" : [ {
"domain" : "global",
"message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.",
"reason" : "apiLimitExceeded"
} ],
"message" : "API limit exceeded: Unable to return a row that exceeds the API limits. To retrieve the row, export the table.",
"status" : "INVALID_ARGUMENT"
}
at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:145)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:312)
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1049)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:410)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:343)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:460)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tBigQueryInput_2Process(criteo_data_import.java:12122)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_6Process(criteo_data_import.java:11680)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_5Process(criteo_data_import.java:10045)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_4Process(criteo_data_import.java:8410)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tBigQueryInput_1Process(criteo_data_import.java:5893)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_3Process(criteo_data_import.java:5352)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_2Process(criteo_data_import.java:3939)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tFileInputDelimited_1Process(criteo_data_import.java:2526)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.tJava_1Process(criteo_data_import.java:1145)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.runJobInTOS(criteo_data_import.java:28113)
at pigu_robertas.criteo_data_import_0_1.criteo_data_import.main(criteo_data_import.java:27942)
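A likely explanation, based on the message in the trace: tBigQueryInput is built to fetch result rows back from BigQuery, so the DELETE itself executes (which is why the rows disappear), but the component then tries to page a result set back through the API, which fails with apiLimitExceeded. A common workaround is to run DML through a component that does not expect rows back, e.g. tBigQueryRow, with standard SQL. A minimal sketch of such a statement (the project, dataset, table, and column names below are hypothetical placeholders, not taken from this job):

```sql
-- Standard SQL DML; replace the placeholder names with your own.
-- Run via a query-execution component (e.g. tBigQueryRow), not an input component.
DELETE FROM `my_project.my_dataset.criteo_events`
WHERE event_date < '2019-01-01';
```

With this approach the component only submits the job and checks its status, so there is no result set for the API to return.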
So I have changed my GBQ partitioned table to wildcard tables, and now everything is working.
I am glad you figured it out. Please select a comment as the solution so this topic is marked as solved.
@Ronertas wrote:
So I have changed my GBQ partitioned table to wildcard tables, and now everything is working.
I have found the real reason why I was getting the error: I had defined a table schema on the component. I tried again with an empty table schema, and there are no errors anymore.