Some QlikView extracts are abending on the SAP side with the message "EXPORT_BUFFER_NO_MEMORY".
The error points to the profile parameter "rsdb/obj/buffersize".
In QlikView the error appears as:
Custom read failed
SQL SELECT *
FROM KNA1
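(For context, a statement like this runs inside a connector load script shaped roughly like the sketch below; the provider name and connection details are placeholders, not taken from this post.)
// Minimal sketch of the failing pattern, with placeholder connection
// details. SELECT * pulls every KNA1 column through the SAP
// export/import buffer that rsdb/obj/buffersize sizes.
CUSTOM CONNECT TO "Provider=QvSAPConnector.dll;ASHOST=saphost;SYSNR=00;CLIENT=100;";
[KNA1]:
Load *;
SQL SELECT *
FROM KNA1;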
Running RSPFLDOC via SE38 shows the parameter values:
Appl. area: General system
Parameter type: Integer Value
Changes allowed: Change permitted
Valid for oper. system: All
Minimum: 16
Maximum: 256,000
Dynamically switchable: (blank)
Same on all servers: (blank)
Dflt value: 4096
Profile value: 20000
Current value: 20000
Has anyone else encountered this error?
How do others have "rsdb/obj/buffersize" configured?
Thanks,
Ben
Your buffer size seems to be very low, but it all depends on how much data you are exporting. How many records do you have in KNA1?
I have the following buffer size in my system, but this can vary in many ways and depends on many other profile parameters. You should check with your Basis guys what can be adjusted with the existing hardware (available memory, etc.).
Dflt value: 4096
Profile value: 400000
Current value: 400000
Hope this helps.
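(If raising the buffer is not an option, a narrower select also reduces the load on it, since each packet then carries only the columns you need. A minimal sketch; the KNA1 field list and aliases are illustrative, not from this thread.)
// Hedged sketch: name the fields instead of SELECT * so each record
// moving through the export buffer is smaller. KUNNR, NAME1 and
// LAND1 are standard KNA1 columns; the aliases are illustrative.
[KNA1]:
Load
    [KUNNR] as [Customer Number],
    [NAME1] as [Customer Name],
    [LAND1] as [Country Key];
SQL SELECT KUNNR NAME1 LAND1
FROM KNA1;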
Thanks for the quick response. KNA1 has 90,000 records.
That's not too much. Ask your Basis guys to analyze the ABAP dump and adjust the buffer parameters. You may also want to experiment with the SAP Connector settings (buffer, packet sizes, etc.); it is all well documented in the SAP Connector installation guide.
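(For illustration, that tuning goes into the connector connection string. A sketch only; PacketSize and FetchBuffers are the setting names as I recall them from the 5.x installation guide, so verify them and the placeholder values against the guide for your connector version.)
// Hedged sketch of connection-string tuning. Names and values are
// assumptions to be checked against the SAP Connector installation
// guide; connection details are placeholders.
CUSTOM CONNECT TO "Provider=QvSAPConnector.dll;ASHOST=saphost;SYSNR=00;CLIENT=100;PacketSize=20000;FetchBuffers=10;";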
Rakesh,
Hi. We just upgraded to 9.0 SR4 / SAP Connector 5.4. We were on 9.0 SR3 / SAP Connector 5.3 in our development environment and things were working fine. Then a new release of the connector came out and we upgraded. Now we are getting the following:
SQL Select *
5/19/2010 13:34:32.1026415 Information 05/19/10 13:34:31: 1472 FROM KNVV
5/19/2010 13:34:32.1026415 Information 05/19/10 13:34:31: 12 fields found: %KNVV, Ship To, %SalesOffice_Key, Customer Emp NonEmp, Customer Acct Assignment Grp, Customer Payment Terms, Customer Invoice Periods, Customer Grp1, Customer Grp2, Customer Grp3, Customer Acquisition,
5/19/2010 13:42:43.3717842 Information Customer Grp5, 1,759,503 lines fetched
5/19/2010 13:42:46.6534752 Information 05/19/10 13:42:43: 1475 store KNVV into qvd\KNVV_SalesOrg.qvd
5/19/2010 13:42:46.8722546 Information 05/19/10 13:42:46: 1478 drop table KNVV
5/19/2010 13:42:46.8722546 Information 05/19/10 13:42:46: 1485 [ADRC]:
5/19/2010 13:42:46.9035088 Information 05/19/10 13:42:46: 1486 Load
5/19/2010 13:42:46.9035088 Information 05/19/10 13:42:46: 1487
5/19/2010 13:42:46.9035088 Information 05/19/10 13:42:46: 1488 [ADDRNUMBER] as [Customer AddrKey],
5/19/2010 13:42:46.9191359 Information 05/19/10 13:42:46: 1489 [HOUSE_NUM1] as [Customer House Number],
5/19/2010 13:42:46.9191359 Information 05/19/10 13:42:46: 1490 [STREET] as [Customer Street],
5/19/2010 13:42:46.9191359 Information 05/19/10 13:42:46: 1491 [SORT1] as [Customer Search Term 1],
5/19/2010 13:42:46.9347630 Information 05/19/10 13:42:46: 1492 [SORT2] as [Customer Search Term 2]
5/19/2010 13:42:47.2629321 Information 05/19/10 13:42:46: 1494 SQL SELECT *
5/19/2010 13:42:47.2629321 Information 05/19/10 13:42:46: 1495 FROM ADRC
5/19/2010 13:42:47.2629321 Information 05/19/10 13:42:47: 5 fields found: Customer AddrKey, Customer House Number, Customer Street, Customer Search Term 1,
5/19/2010 13:51:26.3483128 Information Customer Search Term 2, Error: Custom read failed
5/19/2010 13:51:26.5670922 Information 05/19/10 13:51:26: General Script Error
5/19/2010 13:51:26.5670922 Information 05/19/10 13:51:26: Execution Failed
5/19/2010 13:51:26.5670922 Information 05/19/10 13:51:26: Execution finished.
We have checked all authorizations in SAP and ran a trace in SAP; everything looks fine. Any ideas?
I checked our buffer sizes and they are 4096 (default), 55000 (profile), 55000 (current). We are trying to process 1.8 million rows of data.
Thanks
Thom
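(If the buffer cannot be raised far enough for a 1.8-million-row table, one workaround is slicing the extract into key ranges and concatenating the pieces, much like the BETWEEN predicate that appears later in this thread. A sketch; the KNVV fields and ranges are illustrative, not from the posts.)
// Hedged sketch: split a large extract into key ranges so each
// SELECT stays within the SAP export buffer. Fields and ranges
// are illustrative.
[KNVV]:
Load *;
SQL SELECT KUNNR VKORG VTWEG
FROM KNVV where KUNNR between '0000000001' and '0004999999';

Concatenate ([KNVV])
Load *;
SQL SELECT KUNNR VKORG VTWEG
FROM KNVV where KUNNR between '0005000000' and '0009999999';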
Hi Thom,
Looking at the error log you posted, it seems the problem is in your script rather than in SAP. Could you post your QVW here?
Thanks, Rakesh. I am out of the office sick today, but I will post it tomorrow. We upgraded our DEV environment to 9.0 SR3 with SAP Connector 5.3 and everything ran fine. We then upgraded once more to 9.0 SR4 with SAP Connector 5.4. I made no changes to my QVW. Is there something different in 5.4 vs. 5.3 of the SAP Connector?
Thanks
Thom
Nope, nothing that I noticed. As long as you have imported the new transport requests, in the correct versions for your R/3 release, it should work just fine. Do post your QVW and I can run it through my environment to see whether it works for me.
Get well soon.
Rakesh
Rakesh,
Maybe you can explain this to me. Most of my select statements are SELECT *, and they all work except for one table in SAP (ADRC). For that table I listed the fields I am selecting and it works fine. But here is one where I listed the fields I am selecting and it still aborts:
Custom read failed
SQL Select ZORDER ZACTIVITY ZSTATUS ZWDATE ZHOURS
from ZTRPST where ZWDATE >= '20100103' and ZEMPEQP = 'P'
and ZORDER between '000040000000' and '000049999999'
Thanks
Thom
We upgraded to 9.0 SR4 and SAP Connector 5.4 SR1. We are now getting the following error on the SAP side. Any ideas?
Timeout reached while waiting for clear memory