Well, what I do in this case is use Dynamic Update Commands.
SET Result = ActiveDocument.DynamicUpdateCommand("INSERT INTO * (Country, City) VALUES (DK, Copenhagen), (NO, Oslo)")
If Result = false Then
    ' The update failed - most likely dynamic update is not enabled, see the note below
End If
Watch out: first go to Document Properties > Server and check "Allow Dynamic Data Update", or the command will fail.
And then export to QVD.
set tb = ActiveDocument.GetSheetObject("CH01")
tb.ServerSideExportEx "C:\test.qvd", ";", 4  ' 0=HTML, 1=Text, 2=Bitmap, 3=XML, 4=QVD, 5=BIFF
You'll skip the reload and keep your data in the QVD.
Before exporting, clear your selections. The table exports exactly as it is shown on screen, with the column labels included.
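Putting those two steps together, a minimal sketch (the object ID CH01 and the path C:\test.qvd are placeholders; adjust them to your app):

```vbscript
' Clear all selections first (false = do not clear locked fields),
' so the whole table is exported, then write it out as a QVD (mode 4).
ActiveDocument.ClearAll false
set tb = ActiveDocument.GetSheetObject("CH01")
tb.ServerSideExportEx "C:\test.qvd", ";", 4
```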
It's similar to yours.
If you need it, I'll send you an example.
Thank you for the quick reply!
I tried your solution but the Insert command did not work. Please, if you don't mind, attach an example.
And regarding exporting to QVD instead of CSV: I tried it before, but I do not think you can append new rows to an already existing QVD. You can only overwrite it with new data. Appending works with CSV.
Please let me know what you think about this.
I don't know how to post a file. They are maintaining the page and it's not finished yet. Yes, if you need to append records, use CSV. I always prefer QVD. It's fast, even when there are many records.
SET Result = ActiveDocument.DynamicUpdateCommand ("INSERT INTO TABLE NAME (Field1, Field2) VALUES (" & mVar1.GetContent.String & ", " & mVar2.GetContent.String & ")")
In the macro, you need to pick up the variables first.
Set mVar1 = ActiveDocument.Variables("Var1")
If you use macros, look for the APIGuide in your QlikView documentation. It's very useful.
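Putting the variable pick-up and the INSERT together, a minimal macro sketch (the table name My_Table, the field names, and the variable names Var1/Var2 are placeholders for your own names):

```vbscript
sub InsertRecord
    ' Pick up the input variables first
    set mVar1 = ActiveDocument.Variables("Var1")
    set mVar2 = ActiveDocument.Variables("Var2")
    ' Build the statement from the variable contents and run it
    Result = ActiveDocument.DynamicUpdateCommand("INSERT INTO My_Table (Field1, Field2) VALUES (" & mVar1.GetContent.String & ", " & mVar2.GetContent.String & ")")
    If Result = false Then
        msgbox "Dynamic update failed - check that it is enabled under Document Properties > Server"
    End If
end sub
```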
The thing is, it's 18:40 here in Spain, and I need to go now. More tomorrow.
I got it to work after your last post yesterday. The INSERT command works perfectly and is really fast!
I think that my problem before was that I used ' instead of ", or the other way around...
The problem now is how to make it work for multiple items. In other words the user needs to be able to add a comment for multiple Part Nos in one go. So I am thinking that 2 things need to happen (but I am not sure about how):
1. First the possible values in the Part No field need to be parsed through to the macro.
2. The INSERT needs to run once for each item in the parsed Part No list.
For problem 1 I tried the QlikView function =GetFieldSelections("Part No") as the mVar2 variable definition, which produces a comma-separated list of the selected Part Nos. However, it only works for actually selected values, not for possible values, and only for 6 or fewer selected values. When I select more than 6 values, the variable value changes from a list to something like "10 of 72340". So =GetFieldSelections("Part No") does not handle large selections well.
On the 2nd problem, I think that maybe a for-loop around the INSERT might work.
I have gone through the API and this is my current solution. I use the function GetPossibleValues to produce the list of items to be commented on. But it only takes a maximum of 100 items and no more. It seems there is no function able to return all of the possible values.
sub Insert
    set comment = ActiveDocument.Variables("vPartCommentValue")
    set listOfPossibles = ActiveDocument.Fields("Part No").GetPossibleValues
    for i = 0 to listOfPossibles.Count - 1
        ActiveDocument.DynamicUpdateCommand "INSERT INTO 'Part_comment_1' ('Part No', 'Part comment 1') VALUES (" & listOfPossibles.Item(i).Text & ", " & comment.GetContent.String & ")"
    next
end sub
Exactly, you've been fast. GetPossibleValues takes at most 100 values, no more.
QV stores the information in tables, not in variables.
You may create an array, but I'm not sure of the functionality.
Try this other approach: first deselect everything, then select the values one by one and get each value.
for i=0 to Valores.GetCardinal-1
Ok, so here are some test results. It seems like the for-loops are quite heavy and the time required grows roughly linearly with the number of items.
The whole Part No field that I'm working with has 76000 distinct values, which I imagine would take forever to run - at least 7600 seconds (>2 hours).
The script that I used initially is still faster:
100 items - 18 seconds
76000 items (the whole field) - 1 min 30 seconds
What's your experience here? How many items do you run simultaneously and how long does that take?
Btw, you have been extremely helpful here. Thanks for the friend request as well.
QV was not originally meant to do that. Macros can save you sometimes, but do they work on the server? And how do they perform?
I used that loop for a client who wanted to create a different PDF per value of a field (Business Unit) and send each Business Unit its own report attached as a PDF.
There were about 100 BUs. It works fine on the server too.
I also used it once to perform a true incremental reload, never dropping any table, so there was no need to reload the QVD.
Even if the time you measured, 18 seconds for 100 records, is real, it does not mean that 1000 records will take 180 seconds. It doesn't work like that; it's only an approximation.
If we are talking about 76000 records, I'd think about it in another way. Are those 76000 records to be updated every day, or is that only the initial situation?
I don't know what to tell you at this point. That is too many. The first thing that comes to mind is doing the task in parts: group your records somehow and do 1000 at a time. Divide and conquer. I know you don't want to use intermediate applications like Excel.
Using the expression '<url>' & only(your Excel doc address) in some table, you can open the doc from QV and modify it there. Or you can add an action to a text box.
You're right. Excel is still the king when it comes to entering comments or other loosely specified data input. Thanks for sharing your experience, though.
Regarding my 76000 records, let me just briefly describe what I'm aiming for with all this. The part numbers are updated quarterly and there are already existing natural groups or categories within them. These groups are usually determined by dimensions that come with the source data. For example, there are categories for which factory they are produced in and which design department they belong to, plus several facts such as the consumption per factory, forecast volume, etc.
Most of these categories work perfectly fine, but what they cannot provide is follow-up status. The client uses my application to find business cases within their data, and when they do find interesting things they need to place a comment or tag that says, for example, "These part numbers need attention", "Talk to Susan about these ones", or "Why are these parts used less in Q4 than in Q3?", and so on. When they have examined a business case they need to go back to that group, pick up where they started, and eventually change the tag to "Done", "Unsolved", or something else. Today they do the follow-up in Excel and switch back and forth between QV and Excel, so they do not have a good workflow: they cannot see in QV what they have processed and what they have not.
After I tried GetPossibleValues(1000) I realized that GetPossibleValues(76000) would be too heavy. As you said, the variable is huge.
I would like to contribute something I discovered after having problems with this myself.
Before updating, the macro has to check whether it is allowed to.
' If dynamic update is not active the command does not work, so we enable it first and then the update is possible!
set docprop = ActiveDocument.GetProperties
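A hedged completion of that check: the property name EnableDynamicDataUpdate is my assumption based on the APIGuide, so verify it against your QlikView version before relying on it.

```vbscript
set docprop = ActiveDocument.GetProperties
If docprop.EnableDynamicDataUpdate = false Then
    ' Turn dynamic update on and write the properties back to the document
    docprop.EnableDynamicDataUpdate = true
    ActiveDocument.SetProperties docprop
End If
```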