Mimiek
Partner - Contributor III

Data Gateway 'Failed to read data at {0:9}'

Hello all,

We have been on Qlik SaaS since 31/12.

To get the data from our on-premises Progress database, we are using the Qlik Direct Access Data Gateway.
The problem is that retrieving one particular table does not always succeed: sometimes it works, and sometimes, after about 10 million rows, it fails with this error message:

'Error:(Connector error: Status(StatusCode="Internal", Detail="Failed to read data at {0:9}") (DirectAccess-1512))'

In total, I am reading 17 fields from a table with about 17.5 million rows, which comes to a 300 MB QVD file.

There is another, larger table that I read from, and it works perfectly (a 5 GB QVD). Can anyone deduce from the error message where my problem lies?

Thanks in advance!
Peter

 


8 Replies
NadiaB
Support

Hi @Mimiek 

The issue is likely due to a value that is not supported. The data types supported for Postgres are listed here: https://help.qlik.com/en-US/cloud-services/Subsystems/ODBC_Connector_help/Content/Connectors_ODBC/Po...

 

If you don't see a specific column with an unsupported data type, the best approach is to narrow down to the offending value. Since you have a lot of rows, the best option is probably to narrow down to a time period first (perhaps load the data in a loop, month by month, to see at what point it fails), and from there narrow down to the row/field. A sketch of such a loop follows.
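For illustration, a minimal sketch of that kind of month-by-month loop in the load script. The connection name, table, fields, and 2024 date range are all hypothetical placeholders, not details from your environment:

    // Hypothetical narrowing loop: load one month at a time so the script
    // log shows which period triggers the connector error.
    LIB CONNECT TO 'Direct_Access_Progress';   // placeholder connection name

    FOR vMonth = 1 TO 12

        // Build 'YYYY-MM-DD' month boundaries on the Qlik side.
        LET vFrom = Date(MakeDate(2024, $(vMonth), 1), 'YYYY-MM-DD');
        LET vTo   = Date(AddMonths(MakeDate(2024, $(vMonth), 1), 1), 'YYYY-MM-DD');

        MonthData:   // monthly loads auto-concatenate when the fields match
        LOAD *;
        SQL SELECT OrderId, OrderDate, Notes
        FROM PUB.Orders
        WHERE OrderDate >= '$(vFrom)' AND OrderDate < '$(vTo)';

        TRACE Month $(vMonth) loaded without error;

    NEXT vMonth

The last month traced before the failure brackets the problem value, and you can repeat the same idea with smaller windows from there.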

 

I believe you are using our connectors. Another option would be Generic ODBC, so you can use a third-party driver, perhaps the one from the vendor, and verify the outcome: https://help.qlik.com/en-US/cloud-services/Subsystems/ODBC_Connector_help/Content/Connectors_ODBC/OD...
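If you try that route, the load itself stays the same once the Generic ODBC connection exists; a minimal sketch, assuming a hypothetical connection named 'Progress_Generic_ODBC' created against the vendor's driver:

    // Hypothetical sketch: same extraction through a Generic ODBC connection.
    // Connection, table, and field names are placeholders.
    LIB CONNECT TO 'Progress_Generic_ODBC';

    Orders:
    LOAD *;
    SQL SELECT OrderId, OrderDate, Notes
    FROM PUB.Orders;

If the same rows fail through the vendor's driver, that points away from the Qlik connector and toward the data itself.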

 

Hope it helps. 

 

Don't forget to mark as "Solution Accepted" the comment that resolves the question/issue. #ngm
Mimiek
Partner - Contributor III
Author

Hi @NadiaB ,

It turned out that one field on a single row of the table contained too many characters. Fortunately it was a row I didn't actually need, so I could filter it out when retrieving the data. A sketch of that kind of filter is below.
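Roughly like this; the table, field, and 4000-character threshold are placeholders rather than my actual script, and the exact string-length function depends on the Progress driver's SQL dialect:

    // Hypothetical sketch: exclude the oversized value at the source so the
    // gateway never has to read the problem row. Names and the threshold are
    // placeholders; LENGTH() is an assumption about the driver's SQL.
    Orders:
    LOAD *;
    SQL SELECT OrderId, OrderDate, Notes
    FROM PUB.Orders
    WHERE LENGTH(Notes) <= 4000;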

Thank you for the assistance!

Ken_T
Specialist

Thanks for posting this; it helped us when we hit the same error today.

NadiaB
Support

@Mimiek, @Ken_T, I’m glad to hear the issue was resolved.

Don't forget to mark as "Solution Accepted" the comment that resolves the question/issue. #ngm
Fenil
Contributor III

Hi @NadiaB, yesterday we got the same Direct Access error in production. Could it have the same cause, or is it coming from the connector?

Connector error: Status(StatusCode="Internal", Detail="ERROR [HY000] [Apache Arrow][Flight SQL] (100) Flight returned timeout error, with message: Deadline Exceeded. gRPC client debug context: {"created":"@1724701257.430000000","description":"Deadline Exceeded","file":"C:\jenkins\c@2\warpdrive\vcpkg\buildtrees\grpc\src\85a295989c-6cf7bf442d.clean\src\core\ext\filters\deadline\deadline_filter.cc","file_line":81,"grpc_status":4}. Client context: OK") (DirectAccess-1512))

NadiaB
Support

@Fenil 

This error carries its own error code and looks like a data-source-side issue. I did a quick search and found this: https://community.dremio.com/t/query-is-getting-cancelled-after-30-sec-when-run-from-odbc-client/110...

The message reported in this post was for the "Failed to read data at" error, which doesn't align with what you're seeing. I suggest creating a separate post for different error messages, since the error you are getting is "[HY000] [Apache Arrow][Flight SQL] (100) Flight returned timeout error, with message: Deadline Exceeded."

Don't forget to mark as "Solution Accepted" the comment that resolves the question/issue. #ngm
Fenil
Contributor III

@NadiaB Thanks for the reply. OK, I will create a new post for this. It was reporting DirectAccess-1512, which is why I thought it could be similar.

Thanks again.

Ken_T
Specialist

In one similar case, we found that turning off the bulk reader option in the connection properties can help resolve this DirectAccess-1512 error.