JacobTews
Creator

Help me manage storage on our analytics server?

Hello, dear community!

We are running on-prem Replicate, reading data from a DB2 iSeries (AS400) transactional database and writing to a SQL Server database which then serves as the source for analytics (directly and via transformed and modeled data in the data warehouse).

I am running into storage issues: scheduled Compose workflows are failing because the SQL Server transaction log (LOGFILE) is full. I have truncated the log for the four affected databases, but that alone is not enough, so I am wondering whether I can free up space some other way.
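In case it helps anyone facing the same wall, this is a rough T-SQL sketch of how I've been checking log usage and reclaiming space. Database, file, and path names are placeholders, and whether you take a log backup or switch to SIMPLE recovery depends entirely on your backup strategy:

```sql
-- How full is each database's transaction log?
DBCC SQLPERF(LOGSPACE);

-- What is preventing log reuse? (e.g. LOG_BACKUP, ACTIVE_TRANSACTION)
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases;

-- If log_reuse_wait_desc = LOG_BACKUP, take a log backup first
-- (database name and path below are placeholders):
-- BACKUP LOG [MyDwDb] TO DISK = N'D:\Backups\MyDwDb_log.trn';

-- Then shrink the log file back to a sane size
-- (logical file name is a placeholder; target size is in MB):
-- DBCC SHRINKFILE (N'MyDwDb_log', 4096);
```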

The two largest tables in the database are __ar archive tables, which Compose writes after bringing data into the data warehouse. I believe this is due to the nature of updates at the source: from my understanding, DB2 updates a row one column at a time, so if a row in table XYZ has 15 columns being updated, that produces 15 new rows in the __ar table. The result is an order detail table with 3M rows whose archive table has 157M rows, and an inventory table with 100k rows whose archive has 149M rows.
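One way to confirm that per-column updates are what is inflating the archives would be a query along these lines, counting archived versions per business key (the table and key column names here are purely illustrative, not our real ones):

```sql
-- How many archived versions exist per business key?
-- dbo.ORDER_DETAIL__ar, ORDER_ID, and LINE_NO are placeholder names.
SELECT TOP (20)
    ORDER_ID,
    LINE_NO,
    COUNT(*) AS archived_versions
FROM dbo.ORDER_DETAIL__ar
GROUP BY ORDER_ID, LINE_NO
ORDER BY COUNT(*) DESC;
```

If per-column updates are the cause, the top keys should show version counts close to the number of columns touched per source update.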

After those, though, the next largest in size are mostly __ct tables with transactional information, for tables which exist in the Replicate target database but not in the data warehouse.
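For anyone wanting to check their own database, table sizes can be ranked with standard catalog views (nothing Replicate-specific); this is a sketch, adjust TOP as needed:

```sql
-- Largest tables by reserved space (data + all indexes), in MB.
SELECT TOP (25)
    s.name AS schema_name,
    t.name AS table_name,
    -- Count rows only on the heap/clustered index partitions,
    -- so nonclustered indexes don't double-count them.
    SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS total_rows,
    SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb
FROM sys.dm_db_partition_stats AS ps
JOIN sys.tables AS t ON t.object_id = ps.object_id
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
GROUP BY s.name, t.name
ORDER BY reserved_mb DESC;
```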

  1. Is there a way to change the Replicate settings so that table XYZ getting a 15-column update only results in 1 new row in the __ct (and eventually the __ar) table? Or is this contrary to the very nature of CDC?
  2. What would happen to the __ct tables if I changed the Change Processing > Store Changes Settings > Store Changes Processing to OFF? Does this mean I lose the ability to have a time-stamped __ar table for the tables in the data warehouse?
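On question 2, one stopgap I've been considering (pending official guidance) is pruning old rows from the __ct change tables by their Replicate header timestamp. This is only a sketch: the table name and 30-day retention window are placeholders, the header column name should be verified against your own __ct tables, and I'd check Replicate's built-in change-table retention settings before deleting anything manually:

```sql
-- Placeholder table name; 30 days is an arbitrary example retention.
-- Delete in batches so the transaction log doesn't balloon.
WHILE 1 = 1
BEGIN
    DELETE TOP (50000) FROM dbo.SOME_TABLE__ct
    WHERE header__timestamp < DATEADD(DAY, -30, SYSUTCDATETIME());
    IF @@ROWCOUNT = 0 BREAK;
END;
```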

I hope I'm explaining this properly; please let me know if further details would help. I am trying to avoid having to go back to our Director of Infrastructure to request more storage on the analytics server each time we hit the cap (he was hesitant to allocate even the 2TB we have, but I think he'd be amenable to an increase if we have a clear storage-management plan in place). Thanks in advance for your insights and ideas!

Labels: Qlik Replicate, Qlik Compose, Qlik Compose for Data Warehouses
2 Replies
john_wang
Support

Hello @JacobTews ,

In general, the data warehouse is configured with enough free space to store all of the ODS/HDS data. For your scenario, please open a support ticket; our support team will be more than happy to assist you.

Regards,

John.

JacobTews
Creator
Author

Thanks, @john_wang! That makes sense: in principle the DW should be configured to hold all the historical data the business needs.

I will open up a ticket to get further guidance on the two ideas I proposed in the question.