Not applicable

SAP EXPORT_BUFFER_NO_MEMORY error

Some QlikView extracts are abending on the SAP side with the message "EXPORT_BUFFER_NO_MEMORY".
The error points to the parameter "rsdb/obj/buffersize".

In QlikView the error appears as:


Custom read failed
SQL SELECT *
FROM KNA1



Running RSPFLDOC via SE38 shows the parameter values:

Appl.area: General system
Parameter Type: Integer Value
Changes allowed: Change permitted
Valid for oper.system: All
Minimum: 16
Maximum: 256,000
DynamicallySwitchable: (blank)
Same on all servers: (blank)
Dflt value: 4096
Profile value: 20000
Current value: 20000
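
For context: this parameter sizes the SAP export/import buffer (in kB), and EXPORT_BUFFER_NO_MEMORY is the runtime error raised when an EXPORT TO SHARED BUFFER no longer fits in it. If the buffer were to be enlarged, the change would go into the instance profile (e.g. via RZ10); the value below is purely illustrative, not a recommendation:

rsdb/obj/buffersize = 40000

Since the parameter is not dynamically switchable (see the listing above), an instance restart is needed for a new value to take effect. ST02 shows the buffer's actual usage and swap counts, which is a better guide for sizing than guesswork.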

Has anyone else encountered this error?
How do others have "rsdb/obj/buffersize" configured?
Thanks,
Ben

27 Replies
Not applicable
Author

I tried upgrading to 5.4, but performance was much worse with our 4.6c system.
The release notes mentioned that 5.4 would only work with 6.0 and above.

Also, I've had several problems with "Select *" on large (wide) tables. It works much better when the fields are explicitly listed.
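
For example, a hedged sketch of the explicit form (assuming the connector's Open SQL-style, space-separated field list; the field names are standard KNA1 columns):

SQL SELECT KUNNR NAME1 ORT01 LAND1
FROM KNA1;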

tmumaw
Specialist II

After working a few days with QlikView Tech Services, and after numerous modifications to our connect string, we have come to the conclusion that the problem is the Packetsize parameter's default value (20K). We tried adjusting it from 20K down to 5K, and the 5K setting ran with no problems (QlikView 9.0 SR4 / SAP Connector 5.4 SR1). Do not use the default value; you will have to enter Packetsize=5000 in your connect statement manually. They have logged it as bug ID 29004 with their QA department. If approved, the bug will hopefully be fixed in the next Service Release; however, due to uncertainties regarding potential related problems and program priorities, they cannot guarantee a fix date at this time.
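
For anyone wanting to try the same workaround, a sketch of a connect string with the packet size set explicitly. Only Packetsize=5000 comes from the finding above; the provider name, host, system number, and client are illustrative placeholders that depend on your installation:

CUSTOM CONNECT TO "Provider=QvSAPConnector.dll;ASHOST=sapserver;SYSNR=00;CLIENT=100;Packetsize=5000;XUserId=...;XPassword=...;";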

They will keep me updated on any new developments in this regard.

Original Case Description:

Customer is experiencing timeouts when reloading from SAP after the upgrade to v9 SR4. The logfiles and the SAP document with additional information are attached to the case.

Lars_Wahlstedt
Employee

The reason you got timeouts on the SAP side was several APPLYMAP statements in the preceding LOAD statement that took too much time. Our suggestion was to move the APPLYMAP statements into a LOAD ... RESIDENT statement after the SELECT statement.

By doing this you can keep the Packetsize at 20000 and get much better speed for the download.

In a future release we will increase the timeout limit on the SAP side and include a recommendation that extensive manipulation in the LOAD statement should be done in a LOAD ... RESIDENT statement after the SELECT. The Packetsize default value will not be changed.
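
A sketch of that restructuring (the RegionMap mapping table and the field names are made up for illustration; only the pattern itself comes from this advice):

// slow pattern: ApplyMap in the preceding LOAD keeps the SAP job waiting
// Customers:
// LOAD KUNNR, ApplyMap('RegionMap', LAND1) as Region;
// SQL SELECT KUNNR LAND1 FROM KNA1;

// faster pattern: fetch raw first, then transform in memory
Raw:
LOAD *;
SQL SELECT KUNNR LAND1
FROM KNA1;

Customers:
LOAD KUNNR,
     ApplyMap('RegionMap', LAND1) as Region   // assumes a prior MAPPING LOAD named RegionMap
RESIDENT Raw;

DROP TABLE Raw;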

Kind Regards,

Lars Wahlstedt

tmumaw
Specialist II

Lars,

Why would it work fine in our production environment running the 8.5 / 5.3 connector? Is it a memory issue on the SAP side or the QlikView side? My production environment has 132 GB, whereas my development environment has 16 GB on a virtual box. Thanks, Thom

Lars_Wahlstedt
Employee

We reduced the SAP-side timeout from 60 sec to 30 sec in 5.4, and your APPLYMAP needed 55 sec. So you were lucky in 5.3 🙂

/ Lars W

tmumaw
Specialist II

Lars,

Is there a parameter in the connect string where I can modify the SAP timeout so I can take advantage of the 20K packet size? Or is that controlled through the transports?

Thanks

Thom

tmumaw
Specialist II

Lars,

Am I able to modify the timeout parameter in the connect string? I am trying to extract financial data from GLPCA. When I cut down the number of rows extracted it works, but the user has sent me their requirements and I had to increase the number of rows to extract; now the job aborts. What are the workarounds? The job is aborting on the SAP side with the following messages:

Date Time Message text Message class Message no. Message type

08/02/2010 13:40:23 Job started 00 516 S
08/02/2010 13:40:23 Step 001 started (program /QTQVC/READ_DATA, variant &0000000001216, user ID QLIKVIEW) 00 550 S
08/02/2010 18:58:50 Timeout reached while waiting for clear memory. 00 001 E
08/02/2010 18:58:50 Job cancelled after system exception ERROR_MESSAGE 00 564 A

Lars_Wahlstedt
Employee

Hi Thom,

You are not able to modify this specific timeout. Make sure you do not do any manipulations in the LOAD statement in the QV script; just store the data directly to a QVD file. The reason you get timeouts is that QV fetches the data from SAP too slowly.
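
A minimal sketch of that pattern, with no transformations between SAP and disk (the file name is arbitrary):

GLPCA_Raw:
LOAD *;
SQL SELECT *
FROM GLPCA;

STORE GLPCA_Raw INTO GLPCA_Raw.qvd (qvd);
DROP TABLE GLPCA_Raw;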

Regards, Lars

tmumaw
Specialist II

I am trying to extract our GLPCA table from SAP. When I extract just a few rows for the years 2005-2010 it runs fine, but when I extract all the rows the user has requested, it aborts. I am now trying to run one year at a time and merge the files in QlikView. Years 2005-2009 will never change, so those will be a one-time extract. Thanks for your help; I will let you know the outcome.
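
A sketch of that year-at-a-time approach. It assumes GLPCA's fiscal-year field RYEAR and that the connector accepts a WHERE clause; both are worth verifying on your system:

// one QVD per year; the 2005-2009 slices only ever need to run once
FOR vYear = 2005 TO 2010
  GLPCA_$(vYear):
  LOAD *;
  SQL SELECT *
  FROM GLPCA
  WHERE RYEAR = '$(vYear)';

  STORE GLPCA_$(vYear) INTO GLPCA_$(vYear).qvd (qvd);
  DROP TABLE GLPCA_$(vYear);
NEXT vYear

// later reloads can merge the slices; identical structures auto-concatenate
GLPCA:
LOAD * FROM GLPCA_*.qvd (qvd);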

Thom

suniljain
Master

The main reason for a timeout is that the QlikView connector does not get a response from SAP R/3 within the specified time. A production database server runs a lot of other activity, so the server is sometimes too busy with that work to answer the connector's request. This type of issue commonly happens in SAP R/3 environments.

Sometimes low cache memory is also responsible for timeouts.

Regards

Sunil Jain.