Some Qlikview extracts are abending on the SAP side with the message "EXPORT_BUFFER_NO_MEMORY".
The error points to the parameter "rsdb/obj/buffersize".
In QlikView the error appears as:
Custom read failed
SQL SELECT *
FROM KNA1
Running RSPFLDOC via SE38 shows the parameter values:
Appl.area: General system
Parameter Type: Integer Value
Changes allowed: Change permitted
Valid for oper.system: All
Minimum: 16
Maximum: 256,000
DynamicallySwitchable: (blank)
Same on all servers: (blank)
Dflt value: 4096
Profile value: 20000
Current value: 20000
Has anyone else encountered this error?
How do others have "rsdb/obj/buffersize" configured?
Thanks,
Ben
Dear All,
I am trying to extract the delta of a DSO, but every time I get the error "Custom Read Failed".
The DSO contains billing condition data and is not indexed. It holds about 470,000,000 records.
Can you suggest which parameters I have to set in the connection string to extract the delta?
Regards,
Sunil Jain
Did you try adjusting the packet size? The default is 20,000. I had the same problem; I took mine down to 5,000 and it's working fine now.
Issue resolved after changing timeout setting
Hi Sunil,
Which parameter did you amend in the QV SAP Connect string and to what value please?
We are having the same issue with QV 10 SR1 using SAP Connector 5.50 SR3.
Thanks
I added the following extra parameters to the connection string to extract a cluster table:
TimeOutBatch=1800;
TimeOutFetch=3600;
TimeOutStartBatch=4200;
PacketSize=5000;
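As a sketch of how these parameters sit in a full QlikView SAP connector connect string (the provider name follows the connector's usual convention, and the host, system number, client, and credential values here are placeholders, not values from this thread):

```
CUSTOM CONNECT TO "Provider=QvSAPConnector.dll;ASHOST=sapserver;SYSNR=00;CLIENT=100;XUserId=xxxx;XPassword=xxxx;TimeOutBatch=1800;TimeOutFetch=3600;TimeOutStartBatch=4200;PacketSize=5000;";
```

The timeouts are in seconds, so raising them gives long-running extracts (like large cluster tables) more time before the connector gives up, while a smaller PacketSize reduces the memory needed per fetch on the SAP side.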
Thanks Sunil, will give this a go 🙂
I have different connect strings based on what I am extracting from SAP. For example, if I am extracting VBAK/VBAP data I use a smaller packet size (5000); otherwise I use the default.
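A minimal sketch of that per-source approach, with two connect strings that differ only in PacketSize (host, client, and credentials are placeholders):

```
// Wide transactional tables such as VBAK/VBAP: smaller packets
CUSTOM CONNECT TO "Provider=QvSAPConnector.dll;ASHOST=sapserver;SYSNR=00;CLIENT=100;XUserId=xxxx;XPassword=xxxx;PacketSize=5000;";

// Everything else: connector default packet size (20,000)
CUSTOM CONNECT TO "Provider=QvSAPConnector.dll;ASHOST=sapserver;SYSNR=00;CLIENT=100;XUserId=xxxx;XPassword=xxxx;";
```

Only one CONNECT is active at a time in a QlikView script, so the usual pattern is to issue the appropriate connect string immediately before the SQL SELECT for that table.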
Dear Thom,
I am also using different connection strings based on the nature of the table or the source.
🙂