Not applicable

Creating QVDs fails but no error message

I have a script that creates QVDs for another qvw to use.  It currently takes about 22 hours to run.  When it finishes running, the log file looks as though everything completed as expected (shows number of rows written, no errors are shown).  But in QEMC it shows that the task failed, and when I try to load the resulting QVDs into another qvw, I get 0 rows. 

Has anyone seen this before, or have ideas about what I need to look at?

18 Replies
Not applicable
Author

I'm not sure how to answer your question, and I'm not familiar with the Include statement, so no, I'm not using that.

Not applicable
Author

No, no passwords, except on the database connection.

giakoum
Partner - Master II
Partner - Master II

Then the log you attached is the wrong one.

LOAD * FROM [\\na01\qvshare_dev\Make Orders_qvd_test.qvd] (qvd); is the syntax, I guess.

If you create a 100-row QVD in debug mode, can you load that?
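A minimal sanity check along those lines might look like this (the share path is an assumption, adjust to yours):

```qlikview
// Build a tiny 100-row QVD, write it out, then read it back.
TestData:
LOAD RowNo() as ID
AUTOGENERATE 100;

STORE TestData INTO [\\na01\qvshare_dev\test.qvd] (qvd);
DROP Table TestData;

// If this load also returns 0 rows, the problem is with reading
// QVDs in general, not with the big 22-hour extract.
Check:
LOAD * FROM [\\na01\qvshare_dev\test.qvd] (qvd);
```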

jonathandienst
Partner - Champion III
Partner - Champion III

Hi

This is what I suggest:

Create a text file on your desktop. Call it more.bat

Edit the file so that it consists of the line:

          more %1

Now drag the qvd file and drop it on the more.bat icon.

You should now see the qvd's XML header in the cmd window.

Scan this for the CreatorDoc, the date and, at the end of the field metadata, the row count and the data lineage of the QVD.

You can use this information to see when the QVD is being updated and what has been updated. This may assist in diagnosing the problem.
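As an alternative to the batch-file trick, QlikView itself can read the XML header of a QVD by loading the file as XML. A sketch, assuming the path from earlier in the thread (verify the tag names against your own file's header):

```qlikview
// Read the QVD's XML header as a table; no data rows are loaded.
QvdHeader:
LOAD CreatorDoc,       // qvw that wrote the QVD
     CreateUtcTime,    // when it was written
     NoOfRecords,      // row count according to the header
     TableName
FROM [\\na01\qvshare_dev\Make Orders_qvd_test.qvd]
(XmlSimple, table is QvdTableHeader);
```

Comparing NoOfRecords here with the row count in the reload log should show whether the QVD on disk is really the one the task wrote.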

HTH

Jonathan

Logic will get you from a to b. Imagination will take you everywhere. - A Einstein
christian77
Partner - Specialist
Partner - Specialist

Hi.

I think it is a very large file that complicates it all. 22 hours is too much: you run it on Monday when you get to work and only see the result on Tuesday.

You may want to parallelize your script, which means dividing the work into tasks and running them at the same time, or one every 15 minutes. If it is only one table, try to divide it by some field(s) or key field(s).
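Splitting by a field could be sketched like this (table name, field name and path are assumptions):

```qlikview
// Hypothetical sketch: extract one year at a time into separate QVDs,
// so each chunk can be reloaded (or scheduled) independently.
FOR Each vYear in 2012, 2013
  Orders_$(vYear):
  SQL SELECT * FROM Orders WHERE YEAR(OrderDate) = $(vYear);

  STORE Orders_$(vYear) INTO [\\na01\qvshare_dev\Orders_$(vYear).qvd] (qvd);
  DROP Table Orders_$(vYear);
NEXT vYear
```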

There is also the incremental reload, and even partial reload.

Good luck.

Not applicable
Author

OK, I removed a 'Where Exists' clause in the load of the 2nd table/QVD, which I think was causing the error, because it now loads, although it takes about 30 hours from start to distribution. But now it's so huge it won't render in Access Point or load in Developer. One strange thing I found is that the log file says it's writing 1.1 billion records to the QVD, but when I run a 'SELECT COUNT(*)' against the database, there should only be ~350M records, so I don't know why it's loading 3x more rows. It's just a straight table load with no joins that could be causing duplication.

I'm trying an incremental load now to see how long that will take, and hoping I somehow get a more reasonable number of records.
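For reference, a common incremental-load pattern looks roughly like this. The field names (OrderID, ModifiedDate), the path and the vLastReload variable are all assumptions; note that the Where Exists / Where Not Exists step is what prevents the same rows being appended on every run, which is one possible source of a 3x row count:

```qlikview
// vLastReload is assumed to hold the timestamp of the previous run.
Orders:
SQL SELECT * FROM Orders
WHERE ModifiedDate >= '$(vLastReload)';

// Append the historical rows from the existing QVD, skipping any
// key already fetched above so nothing is duplicated.
Concatenate (Orders)
LOAD * FROM [\\na01\qvshare_dev\Orders.qvd] (qvd)
WHERE NOT Exists(OrderID);

STORE Orders INTO [\\na01\qvshare_dev\Orders.qvd] (qvd);
```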

fabio_vallone
Creator
Creator

Maybe you can change, in the server (Documents > Reload), the "Timeout seconds" setting to 86,400 (= 24 hours).

christian77
Partner - Specialist
Partner - Specialist

Hi.

It is impossible to see California and Texas in detail at the same time.

Do those 350 million records all have to be visible at the same time? Impossible.

Bring in detailed information for the present and try to aggregate the past.

Call-center systems like AVAYA do that: there is no way to keep the full detail for the millions of calls received every day.

Use Publisher to divide the document by section access. Once published, the individual files are smaller.

There is also DIRECT SELECT from QV 11.2, where data comes on demand from the user. A constant connection is needed; I don't know about performance, but I don't think it is very fast.

Simplify your life.

About the 3x more rows, can you send an example of your script? I'm sure somebody in this community will find the error.


Not applicable
Author

The script that is being run is included in the log file I attached above.

DIRECT SELECT isn't really something we need - we're not looking for live data feeds, in fact we only reload this document weekly.

I would love to be able to summarize the older data on import, but I don't know how to do that and still meet the requirements of the report. We need to measure the effect of coupon use on ordering history, so for any coupon code(s) selected, the user needs to see the average number of orders placed in the 3/6/12-month intervals before and after the first use of that coupon. I don't know how to achieve that flexibility without pulling in all the ordering data for the past 2 years. I'm open to suggestions on that.
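Since the 3/6/12-month windows are all whole months, one option might be to pre-aggregate the history to one row per customer per month; the averages can still be computed from that grain. A sketch, where all table, field and path names are assumptions:

```qlikview
// Hypothetical pre-aggregation: collapse two years of order detail
// to customer/month level before the main qvw loads it.
MonthlyOrders:
LOAD CustomerID,
     MonthStart(OrderDate) as OrderMonth,
     Count(OrderID)        as Orders
FROM [\\na01\qvshare_dev\Orders.qvd] (qvd)
GROUP BY CustomerID, MonthStart(OrderDate);

STORE MonthlyOrders INTO [\\na01\qvshare_dev\MonthlyOrders.qvd] (qvd);
```

This shrinks hundreds of millions of order rows to at most 24 rows per customer, while still letting the front end sum orders over any 3/6/12-month window around the first coupon use.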