Mahesh3
Contributor

Updating GCP table data using Qlik Application Automation

Hello All,

I need to update rows in a GCP table using automation, but there is no block for updating a row or rows in Google BigQuery. Please help me with this.


Thanks in Advance

1 Solution

Accepted Solutions
Mourcy
Contributor

Hello @Mahesh3, I read your question, "Updating GCP table data using Qlik Application Automation."

I'm also a Qlik Community user, and your problem relates to Google BigQuery.

You can follow the approaches I describe below:

Using DML Statements:
BigQuery supports DML statements that allow you to perform INSERT, UPDATE, and DELETE operations on your data. For updating rows, you can use the UPDATE statement with a WHERE clause to target the specific rows that need updating.


Here's an example of how you can use DML to update rows:

CODE:

UPDATE `project_id.dataset_id.table_id`
SET column1 = 'new_value1', column2 = 'new_value2'
WHERE condition_column = 'condition_value';

In this example, replace project_id, dataset_id, table_id, column1, column2, condition_column, and condition_value with your actual values.

Keep in mind that DML operations in BigQuery can incur additional costs and have certain limitations, so use them judiciously.
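If the new values come from another table (for example, a staging table that your automation loads first), BigQuery's UPDATE ... FROM form can apply them all in one statement. This is only a sketch; the staging table, key column, and column names below are placeholders, not real objects:

```sql
-- Apply values from a staging table to the target table in one DML statement.
-- `staging_table`, `key_column`, `column1`, and `column2` are placeholders.
UPDATE `project_id.dataset_id.table_id` AS t
SET t.column1 = s.column1,
    t.column2 = s.column2
FROM `project_id.dataset_id.staging_table` AS s
WHERE t.key_column = s.key_column;
```

This pattern suits an automation flow well: load only the changed rows into the staging table, then run a single UPDATE instead of one DML statement per row.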

Using Table Replacement (Suggested for Large-Scale Updates):
Rather than updating individual rows directly, another approach is to create a new table with the updated data and then replace the original table with it. This method is more efficient for large-scale updates, as it leverages BigQuery's storage architecture instead of modifying rows in place.
Here is a step-by-step guide to using table replacement for updates:

Step 1: Create a new table for the updated data.
Step 2: Copy the existing data into the new table, applying the necessary updates during the copy.
Step 3: Verify that the new table contains the correct updated data.
Step 4: Swap the old table for the new one.

Keep in mind that table replacement may involve some downtime for your data, and you may need to adjust your application to point at the new table after the swap.

Before performing any updates or major changes to your data, it's important to test these techniques on a smaller scale or in a test environment so you understand the process and its implications.
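The table-replacement steps above can be sketched in BigQuery SQL as follows. This is only an illustration; every table and column name is a placeholder you would substitute with your own:

```sql
-- Steps 1-2: create a new table and copy the data in, applying the update
-- during the copy (SELECT * REPLACE rewrites one column's value).
CREATE TABLE `project_id.dataset_id.table_id_new` AS
SELECT
  * REPLACE (
    IF(condition_column = 'condition_value', 'new_value1', column1) AS column1
  )
FROM `project_id.dataset_id.table_id`;

-- Step 3: verify the new table (row counts, spot checks) before swapping.
SELECT COUNT(*) AS row_count FROM `project_id.dataset_id.table_id_new`;

-- Step 4: swap the tables.
DROP TABLE `project_id.dataset_id.table_id`;
ALTER TABLE `project_id.dataset_id.table_id_new` RENAME TO `table_id`;
```

Note that DROP followed by RENAME is not atomic, so run the swap during a window when nothing is querying the table.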

 

I hope you find my answer helpful; I always try to give my best.

Thank you,

Best 

Mourcy K. Anderson

 
