What is QVX_UNEXPECTED_END_OF_DATA? One of our clients is facing this problem when they load data from Oracle. Please find the snapshots attached.
Thanks
Rupam
Solved it myself...
I just deleted all the tasks on the server and scheduled them again.
What the explanation is, I really don't know, but it worked.
Regards,
I just had the same QVX_UNEXPECTED_END_OF_DATA error when loading an Oracle database view and storing it into a QVD. In my case the cause was a division by zero that occurred in the view; in fact, a SELECT run directly against the database already raised ORA-01476, so nothing was wrong with QlikView in my case.
Rgds,
Joachim
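For anyone hitting the same ORA-01476, a common fix is to guard the divisor with NULLIF so the division yields NULL instead of raising an error mid-fetch. A minimal sketch, with made-up table and column names:

```sql
-- Hypothetical example: revenue / units raises ORA-01476 when units = 0.
-- NULLIF(units, 0) returns NULL when units is 0, so the division yields
-- NULL for that row instead of aborting the fetch partway through.
SELECT order_id,
       revenue / NULLIF(units, 0) AS unit_price
FROM   sales_view;
```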
I realize that this is an old question but I encountered the same issue today and wanted to share more details as I'm sure others will encounter this in the future.
The problem in my case was that if I ran the query in any SQL query window it would execute fine and start returning data ... and then crash at record 300,000+. When QlikView executes the query, it expects one of two outcomes: either the query fails because there is something wrong with it, or the query obviously worked because SQL Server starts transferring data. QlikView's data extraction keeps pulling packet after packet of data and then suddenly encounters an error.
In my specific case I was doing a DATEDIFF in SQL to compare two field values. When it encountered a brand-new value today with the year 9201 (7,000 years in the future), the query crashed. All QlikView knows is that the data mysteriously ended abruptly.
Hope this helps those 7,000 years in the future who google the search term that leads you to this community post.
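A defensive WHERE clause that excludes implausible dates before the DATEDIFF runs avoids this kind of mid-stream crash. A T-SQL sketch with invented column names:

```sql
-- Hypothetical example: exclude rows whose dates fall outside a sane
-- window, so DATEDIFF never sees a year like 9201 and the full result
-- set streams back to QlikView without aborting partway through.
SELECT order_id,
       DATEDIFF(day, created_date, closed_date) AS days_open
FROM   orders
WHERE  created_date BETWEEN '1900-01-01' AND '2100-01-01'
  AND  closed_date  BETWEEN '1900-01-01' AND '2100-01-01';
```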
Hi,
Restarting the QVS task helped me, but unfortunately it does not guarantee the error disappears.
Best regards,
Maxim
Hi,
I am also having exactly the same issue.
Can anyone please share a solution?
Waiting for a reply.
Error:QVX_UNEXPECTED_END_OF_DATA
Regards,
Khasim.
I just encountered the same issue when using OPENQUERY through a linked server. I found that limiting the number of columns in the query corrected my issue.
I hope this helps!!!
Dan
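If you are going through a linked server, naming only the columns you need inside the OPENQUERY text (rather than pulling everything) is the change Dan describes. A sketch, assuming a linked server and table that are placeholders here:

```sql
-- Hypothetical example: list only the required columns inside the
-- OPENQUERY pass-through text instead of SELECT *, reducing the amount
-- of data the linked server has to marshal back per row.
SELECT *
FROM OPENQUERY(REMOTE_SRV,
               'SELECT customer_id, order_date, total
                FROM   dbo.orders');
```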
Just confirming Dalton's observation - this is related to a query failure while returning data from the DBMS. In other words, the query parses correctly, but an unexpected error occurs after data starts returning.
In my case, running it in Query Analyzer returned an error message:
Msg 512, Level 16, State 1, Line 1
Subquery returned more than 1 value. This is not permitted when the subquery follows =, !=, <, <= , >, >= or when the subquery is used as an expression.
Drilling down further, in my case a subquery in the FROM clause of my query was returning multiple values where only one should ever have been returned. Once I corrected the data causing the multiple values, the query (and the load script) ran fine.
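The classic shape of Msg 512 is a scalar comparison against a subquery that can return more than one row; fixing the duplicate data, or constraining the subquery to a single value, is the usual remedy. A sketch with invented table names:

```sql
-- This shape fails with Msg 512 as soon as customer_map holds two rows
-- for the same source_id:
--   WHERE c.customer_id = (SELECT m.mapped_id FROM customer_map m
--                          WHERE  m.source_id = c.source_id)
-- One defensive rewrite forces a single value per key:
SELECT c.customer_id
FROM   customers c
WHERE  c.customer_id = (SELECT MAX(m.mapped_id)
                        FROM   customer_map m
                        WHERE  m.source_id = c.source_id);
```

The real fix, as noted above, is to correct the data so the duplicates never exist; the MAX() guard just keeps the load running in the meantime.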
Hi, I encountered the same issue while connecting via the REST API connector. A manual reload from the desktop worked, but the QVS task always failed. This is my error message:
2016-07-06 08:51:54 QVX_UNEXPECTED_END_OF_DATA: DNS error
2016-07-06 08:51:54 Error: Custom read failed
2016-07-06 08:51:54 General Script Error
2016-07-06 08:51:54 Execution Failed
2016-07-06 08:51:54 Execution finished.
I might as well join the group...
my error reads:
QVX_UNEXPECTED_END_OF_DATA: SQL##f - SqlState: S1001, ErrorCode: 4294965484, ErrorMsg: [Microsoft][ODBC Microsoft Access Driver]
The query cannot be completed. Either the size of the query result is larger than the maximum size of a database (2 GB), or there is not enough temporary storage space on the disk to store the query result.
I'm using ODBC to an Access database - I have 6 columns and fewer than 10k rows. This QVW reloads every 15 minutes; out of 10 refreshes, on average 4 fail and the rest succeed.
Any suggestions? Has anyone found the answer?
I have just had the error accessing a SQL database.
It seems to be related to bad data. I had a DATEDIFF calculation in the SQL and one field wasn't valid, so it basically couldn't perform the calculation. When I added a WHERE clause to the query and eliminated those bad records, it worked fine. The data extract is over 23 million rows.
This would explain why the query can run perfectly for a long time and then suddenly crash, as well as why it happens with different data sources: if problematic new data comes into that table, the load crashes.
Hope that helps.