Karthik3
Creator III

Error when exporting Large tables from SAP via connector

Hi,

    I am getting the following error when extracting the BSEG table. I am using QlikView 10 SR3 and SAP connector 5.7. It was working fine two days ago, but suddenly I am unable to extract large tables.

SQL SUBSELECT * FROM BSEG WHERE BELNR IN ( SELECT BELNR FROM BKPF WHERE BLDAT >='20120101')
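For context, in a QlikView load script this extraction would typically sit under a connector connection; a minimal sketch, assuming the standard SAP connector provider name (host, system number, client, and credentials below are placeholders, not values from this thread):

```
// Hypothetical connection string – ASHOST, SYSNR, and CLIENT are placeholders.
// PacketSize is the connection-string setting discussed later in this thread.
CUSTOM CONNECT TO "Provider=QvSAPConnector;ASHOST=sapserver;SYSNR=00;CLIENT=100;PacketSize=5000;";

// The failing extraction: all BSEG line items whose header (BKPF) document date is 2012-01-01 or later
SQL SUBSELECT * FROM BSEG WHERE BELNR IN ( SELECT BELNR FROM BKPF WHERE BLDAT >= '20120101');
```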

Error:

QVX_UNEXPECTED_END_OF_DATA : Fetch Aborted after 241 retries.Key = TIMEOUT_READ_MEMORY

Timeout when trying to read shared buffer

It was working fine before, but I recently upgraded the SAP transports and the QlikView connector. Is this problem on the QlikView side, or did someone change something on the SAP side? Any help, please?

2 Replies
Not applicable

I am also getting this problem. It is most likely a bug in version 5.7 of the connector.

I found that if I specify the fields in the query (rather than using select *) AND take out all custom fields (those starting with Z), it works. Then I can put the Z fields back in and run it again. I know this sounds crazy; I am still debugging the ABAP used by the SAP connector to see how it is creating its SQL statement to send to the database.

So change this:

select * from TABLEX

to this:

select c1 c2 c3 from TABLEX

Now run it.

Now change it to this:

select c1 c2 c3 z1 z2 .... from TABLEX
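Applied to the original BSEG query, the two-step workaround would look something like this. The standard fields listed are an illustrative subset (not a complete extraction list), and ZFIELD1 stands for whatever custom Z field exists in your system:

```
// Step 1: standard fields only (illustrative subset of BSEG fields)
SQL SUBSELECT BELNR BUKRS GJAHR WRBTR FROM BSEG WHERE BELNR IN ( SELECT BELNR FROM BKPF WHERE BLDAT >= '20120101');

// Step 2: after a successful run, add the custom Z fields back in
// (ZFIELD1 is a hypothetical custom field name)
SQL SUBSELECT BELNR BUKRS GJAHR WRBTR ZFIELD1 FROM BSEG WHERE BELNR IN ( SELECT BELNR FROM BKPF WHERE BLDAT >= '20120101');
```

Note that, following the SAP Open SQL style shown above, the field list is space-separated rather than comma-separated.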

I'll keep you posted on the debug results. We may have to upgrade to 5.8. I am also still investigating a possible SAP note that would fix this.

Update:

It appears that my first run, which failed, used a packet size of 20000, even though I had it set to 5000 in the connection string. Apparently taking out the Z fields, running it, and then putting them back in made the CUSTOM CONNECT honor the packet size setting.

Not applicable

Hi karthikmungi,

I have the same problem as you. My BSEG table has 160M+ rows, and I use parameters like BufferPercentage = 2 and TimeOutFetch = 20000 or bigger, but it still shows the same error. Have you fixed it yet?
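For reference, those tuning parameters go in the connector's connection string. A sketch assuming the parameter names mentioned in this thread (host, system number, and client are placeholders):

```
// Hypothetical connection string – server details are placeholders.
// BufferPercentage and TimeOutFetch are the tuning parameters tried here;
// PacketSize is the setting that was silently ignored in the report above.
CUSTOM CONNECT TO "Provider=QvSAPConnector;ASHOST=sapserver;SYSNR=00;CLIENT=100;PacketSize=5000;BufferPercentage=2;TimeOutFetch=20000;";
```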

rnetherton, do you mean that we must join a table without the Z fields to a table with the Z fields?