peter_turner
Partner - Specialist

Does QV affect SQL transaction log size?

Hello All!

Q: Does QV affect SQL transaction log size?

I've seen on two machines that the SQL transaction log had grown to its maximum size and was affecting the SQL database's ability to store data.
As QV should only be reading data (with SQL SELECT statements), I'm unsure why this might happen.
Has anyone had similar behaviour?

I know a possible solution is to perform database backups, as this will clear the transaction log, but I wanted to know why the situation occurred in the first place.

Thanks,
Peter.

1 Solution

Accepted Solutions
chriscammers
Partner - Specialist

In addition to working with our BI tools I occasionally wear the SQL DBA hat, so I'll throw in my two cents.

Depending on the recovery model your database uses, you must perform regular log and database backups to keep the size of the log under control.

If you are in Simple recovery mode, the log will manage itself, but now that your log is large you will have to shrink it to get it back to a more "normal" size.

If you are in Full recovery mode, you must back up your database and logs in order to manage the log size. Backing up the log truncates the committed transactions from it, freeing that space for reuse.

Generally speaking, I would not worry about reads from QlikView causing the log to grow. What I'd ask instead is: do you have maintenance plans running on a regular basis, and how much does your data change each day? That will give the best answer as to how your log file got so big. So unless you are dumping this database and reloading it from scratch every night, or it is read-only, I would make sure to get a maintenance plan running. Eventually your log will fill the disk, and then you won't be able to do anything.
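As a quick way to check which case applies, something like this should work (a minimal sketch; YourDatabase and the backup path are placeholders, not from this thread):

-- Check the recovery model of the database (YourDatabase is a placeholder)
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'YourDatabase';

-- Under the Full model, regular log backups are what stop the log growing without bound
BACKUP LOG YourDatabase
TO DISK = 'D:\Backups\YourDatabase_log.trn';

A maintenance plan is essentially a scheduled wrapper around commands like that BACKUP LOG.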

Like I said, DBA is one of the hats I wear, so I'm not necessarily an expert, but I've dealt with this situation many times.

Thanks

Chris


10 Replies
johnw
Champion III

If you're logging reads, then yes, QlikView would affect the SQL transaction log size. I believe that in our environment, we only log insert/update/delete, so at least in our environment, that isn't an issue. It sounds like you don't expect that to be an issue in your environment either.

Another thing you may need is FOR FETCH ONLY in your SQL statements in QlikView. In our system, if you don't specify FOR FETCH ONLY, it will lock the records you're reading on the assumption that you may follow the read with an update. On top of it being a bad idea to take locks simply to read data, it seems plausible that those locks would push data into your SQL transaction log.

peter_turner
Partner - Specialist
Author

Hi John,

Thanks for the reply, that's very interesting!
Do you have any examples/info on the FETCH command, as I've not seen this before.

A common statement I might use is:

SELECT field1, field2 FROM table WHERE field1 = 100

How would this be changed to use FETCH?

We're running Windows Server 2003 and SQL Server 2005.

Thanks,
Peter.


johnw
Champion III

Most of my SQL experience is with DB2. When retrieving from DB2, the "for fetch only" goes right on the end like this:

SELECT
field1
,field2
FROM table
WHERE field1 = 100
FOR FETCH ONLY
;

I don't know if this is applicable to SQL server or not.

Not applicable

For SQL Server you will want to use NOLOCK, i.e.

SELECT fields FROM table WITH (NOLOCK) WHERE whereclause

Of course, dirty reads have cons of their own.

Data may be updated while you are reading it (something that can happen with a query that has a long life cycle).

Phantom records: data that exists because of an uncommitted transaction (like an insert) but still has the possibility of being rolled back.
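For what it's worth, the same dirty-read behaviour can be set for a whole session instead of hinting each table (a sketch; the table and field names are placeholders):

-- READ UNCOMMITTED is the session-level equivalent of putting NOLOCK on every table
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT field1, field2
FROM table1
WHERE field1 = 100;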

peter_turner
Partner - Specialist
Author

Hello,

It looks like FOR FETCH ONLY is DB2-only, but it's a good idea.
In the end I suspect the transaction log had grown so much because we had been adding large amounts of data with no backups.
Implementing a maintenance plan is a good idea just for the backup process, and I'd recommend that for any live QV database system.

One thing to be aware of: if you're using SQL Server Management Studio Express (the free version), you don't get the maintenance plan option and can only create backups manually.
There is a way around this: get the T-SQL code from the backup wizard window (the generate-script option), use the sqlcmd.exe command-line program to run your script file, then use a Windows scheduled task to run it automatically.
A long way of doing things, but it works in a pinch.
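For example, the saved script might look something like this (a sketch; the database name and paths are placeholders, not actual wizard output):

-- backup_log.sql - T-SQL saved from the backup wizard's generate-script option
BACKUP LOG YourDatabase
TO DISK = 'D:\Backups\YourDatabase_log.trn'
WITH INIT;

The scheduled task would then run something like sqlcmd -S .\SQLEXPRESS -E -i C:\Scripts\backup_log.sql.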
Peter.

Not applicable

BTW: shrinking/truncating the log file when the log is full in MS SQL:

USE <DatabaseName>
GO
-- Shrink the log file down to its minimum size
DBCC SHRINKFILE(<TransactionLogName>, 1)
-- Discard the log contents without backing them up
-- (TRUNCATE_ONLY works on SQL Server 2005 but was removed in SQL Server 2008;
-- it breaks the log backup chain, so take a full backup afterwards)
BACKUP LOG <DatabaseName> WITH TRUNCATE_ONLY
DBCC SHRINKFILE(<TransactionLogName>, 1)
GO

Some people run it as a scheduled task ....

peter_turner
Partner - Specialist
Author

Hi Ondrej,
Thanks for the code, I just gave that a go and it works great!
I do have a question regarding the two result tables that get displayed, with fields such as DbId, CurrentSize, etc.
Both result tables are the same for me; I'm guessing the first should show the size info from before the process runs?

Thanks,
Peter.

Not applicable

Hi Peter,

I'm not sure why the sizes come out the same for you. But have a look here:

http://support.microsoft.com/kb/907511

and here:

http://msdn.microsoft.com/en-us/library/aa258824(SQL.80).aspx