Not applicable

Stock movements across 400 retail branches

I'm new to QlikView and struggling with the design of a stock movement view.

The transactional DB is Oracle and has a single table of current stock levels with 50 million rows (reading the table in QV takes about 1hr 30mins; it's a 4.5 GB table). There is no history, and we want to analyse stock movement history!

The basic idea was to load the table to a QVD with a timestamp and analyse with a granularity of 15 minutes, but a 1:30 load doesn't fit into a 15-minute window!

There are approx 30,000 transactions per hour and these are recorded to a transaction table, but using this (much larger) table would mean calculating deltas, i.e. initial stock level - (sum(movements) in time interval).
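The delta idea can be sketched as a QlikView load script. This is only a sketch: the table and field names (STOCK_TX, ITEM_ID, STORE_ID, QTY, TX_TIME) are assumptions, not your actual schema.

```qlikview
// Bucket each transaction into a 15-minute interval and sum the (signed) movements.
// 1/96 of a day = 15 minutes, so Floor(TX_TIME, 1/96) snaps to the interval start.
Movements:
LOAD
    ITEM_ID,
    STORE_ID,
    Timestamp(Floor(TX_TIME, 1/96)) AS IntervalStart,
    Sum(QTY)                        AS IntervalMovement
GROUP BY ITEM_ID, STORE_ID, Timestamp(Floor(TX_TIME, 1/96));
SQL SELECT ITEM_ID, STORE_ID, QTY, TX_TIME FROM STOCK_TX;

// Separately, take ONE baseline snapshot of the 50M-row stock table (loaded once).
// Stock at the end of any interval is then:
//   baseline level + cumulative sum of IntervalMovement up to that interval,
// which a front-end expression (e.g. RangeSum over Above()) or a resident
// accumulation load can compute without ever re-reading the big table.
```

The point is that the slow 50M-row read happens once; only the 30k/hr transaction stream is touched on each refresh.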

I'm interested to hear how more experienced developers would handle this.

There is no traditional data warehouse here (except persisted QVDs), in line with QV's in-memory model. To complicate things further, my experience is in SQL Server, not Oracle, and a further constraint is that triggers on the stock table are ruled out for performance reasons.

All opinions welcome!

Colin

7 Replies
Not applicable
Author

Colin,

You can load the first table with 50 million rows only once, and then append the transactional data (approx 30k/hr) every hour or so. This way the 30k partial load does not take a lot of time.

As far as performance is concerned it all depends on what type of KPIs you are looking for.

Those are my 2 cents.

Talha

Not applicable
Author

The primary table is current stock of item X in store Y, and the transactional data is in a separate table, i.e. item X, store Y, qty sold for each sale item.

Any ideas how/where to calculate the current stock qty and movement per time period?

Not applicable
Author

Your question is not clear. If you are only concerned about the load, what I would suggest is:

1. Do an initial load of the 30,000 records from Oracle into the QVDs.

2. Subsequent loads should be incremental loads, i.e. all inserts after the initial load must be captured and appended to the QlikView QVDs.
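A minimal sketch of those two steps, assuming a transaction timestamp field TX_TIME exists (all names here are assumptions):

```qlikview
// ---- Step 1 (run once): full extract to QVD ----
StockTx:
SQL SELECT * FROM STOCK_TX;
STORE StockTx INTO StockTx.qvd (qvd);
DROP TABLE StockTx;

// ---- Step 2 (run hourly): load only new rows, then append the history ----
// Read the high-water mark from the existing QVD.
MaxTime:
LOAD Max(TX_TIME) AS MaxTxTime FROM StockTx.qvd (qvd);
LET vLastLoad = Timestamp(Peek('MaxTxTime'), 'YYYY-MM-DD hh:mm:ss');
DROP TABLE MaxTime;

StockTx:
SQL SELECT * FROM STOCK_TX
WHERE TX_TIME > TO_DATE('$(vLastLoad)', 'YYYY-MM-DD HH24:MI:SS');

CONCATENATE (StockTx)
LOAD * FROM StockTx.qvd (qvd);

STORE StockTx INTO StockTx.qvd (qvd);
```

The hourly run only pulls ~30k rows from Oracle; the rest comes from the QVD, which loads far faster than the database read.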



Not applicable
Author

The table contains 50M stock master rows whose current stock value is constantly updated by all stores. There is no LastUpdated field, and a trigger to store and update a LastModified field is out of the question for performance reasons (it has been tried before). Also out is any application change to maintain LastModified, as there are multiple applications/vendors/devices updating this stock master file.

Any clever solutions to obtain these stock movement deltas would be appreciated. The current ideas in this area are:

Select minus Queries
Oracle Flashbacks of Row History
Oracle Materialised Views
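For the MINUS idea: since there is no LastModified field, one option is to keep a snapshot copy of the stock table from the previous run and let Oracle return only the rows that changed. A hedged sketch, where STOCK_MASTER is the live table and STOCK_SNAP is a hypothetical snapshot copy (both names are assumptions):

```qlikview
// Rows whose stock level differs from the previous snapshot = the deltas.
Deltas:
LOAD ITEM_ID, STORE_ID, STOCK_QTY, Now() AS SnapshotTime;
SQL SELECT ITEM_ID, STORE_ID, STOCK_QTY FROM STOCK_MASTER
    MINUS
    SELECT ITEM_ID, STORE_ID, STOCK_QTY FROM STOCK_SNAP;

STORE Deltas INTO StockDeltas.qvd (qvd);

// Afterwards the snapshot must be refreshed on the Oracle side, e.g.:
//   TRUNCATE TABLE STOCK_SNAP;
//   INSERT INTO STOCK_SNAP SELECT ITEM_ID, STORE_ID, STOCK_QTY FROM STOCK_MASTER;
```

Be aware that a MINUS over 50M rows is itself a heavy query, so it would need testing against your 15-minute window; a materialised view log, if acceptable to the DBAs, would shift that cost to write time instead.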

Not applicable
Author

I want to know if you found a solution for the problem you last posted above.

I have the same problem, with a lot of historical transactions to manage.

Thanks in advance.

richard_chilvers
Specialist

Hi Taxaw

Is your problem simply the volume of transaction data?

If so, as earlier answers suggest, you need to build the QVD data only once. After that, update it with new transactions, updated transactions (and deletions if necessary) using an incremental load.
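An incremental load that also handles updated and deleted transactions typically looks like this sketch (assuming, hypothetically, a unique key TX_ID and a LAST_MODIFIED timestamp on the source table; neither is confirmed in this thread):

```qlikview
// Pull rows inserted or updated since the last run.
Changes:
SQL SELECT * FROM STOCK_TX
WHERE LAST_MODIFIED >= TO_DATE('$(vLastLoad)', 'YYYY-MM-DD HH24:MI:SS');

// Append the old history, but only rows NOT superseded by a newer version.
CONCATENATE (Changes)
LOAD * FROM StockTx.qvd (qvd)
WHERE NOT Exists(TX_ID);

// Handle deletions: keep only keys that still exist at the source.
INNER JOIN (Changes)
SQL SELECT TX_ID FROM STOCK_TX;

STORE Changes INTO StockTx.qvd (qvd);
```

If rows are never updated or deleted at the source, the simpler insert-only variant (just the SQL extract plus CONCATENATE) is enough.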

This is described in the documentation and there are also some other explanations on this forum.

HTH

Not applicable
Author

Hi Richard. Thank you for your suggestion; I've only just seen it.

I've already built a QVD from my transactional data once and computed the stock level for each product/warehouse/date.

I had followed Oleg's suggestion here: http://community.qlik.com/message/79979 (Inventory Balances).

I had the right result for a granularity of day per product and per warehouse, but the result takes a lot of memory when loading, for example, a month (30 days) of stock in a QlikView document.

My users want to analyse the stock level history over a year, comparing it to sales within the same period.
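One way to fit a year into memory is to keep daily balances only for the recent period and aggregate older history to month-end, cutting the row count roughly 30x. A sketch, assuming a hypothetical DailyBalance.qvd with ProductID/WarehouseID/BalanceDate/StockLevel fields:

```qlikview
// For each product/warehouse/month, keep only the balance on the last
// recorded day of the month (FirstSortedValue with -BalanceDate picks
// the StockLevel belonging to the latest date in the group).
MonthEndBalance:
LOAD
    ProductID,
    WarehouseID,
    MonthEnd(BalanceDate)                        AS BalanceDate,
    FirstSortedValue(StockLevel, -BalanceDate)   AS StockLevel
FROM DailyBalance.qvd (qvd)
GROUP BY ProductID, WarehouseID, MonthEnd(BalanceDate);
```

Year-over-year stock vs. sales comparisons usually read fine at monthly granularity, and the daily detail can stay in per-month QVDs to be loaded on demand.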

Taxaw.

Sorry for my poor English.