sparur
Specialist II

Some problems with QV SAP Connector 5.4 SR1

Hello, friends.

I'm seeing some strange problems with the latest version of the SAP Connector. Has anyone run into something like this?

After updating to the latest version, my reload job often fails, but not always.

I did some investigation and found that the problem is with fetching the data. My query has started returning a different number of rows on different runs: sometimes the SELECT returns 0 rows (even though I know the rows exist in the SAP table), and sometimes it returns all rows, as expected.

Does anybody have any ideas why the same query sometimes returns different data?

20 Replies
suniljain
Master

It is most likely that this is a DataSource that does not send delta data to the BW system via the delta queue but directly via the extractor. You can display the current delta data for these DataSources using transaction RSA3 (update mode = 'D').

suniljain
Master

It is most likely that a delta initialization has not yet run, or that the delta initialization was not successful. A successful delta initialization (the corresponding request must have QM status 'green' in the BW system) is a prerequisite for the application data to be written to the delta queue.

sparur
Specialist II
Author

No, the delta load on SAP BW has finished. The data refresh on SAP BW usually finishes between 8:00 and 8:15 AM, and after that, at 8:45, I reload my QVW extract.

So I think it may be a bug in SAP Connector 5.4 SR1...

sparur
Specialist II
Author

Are you using this version of the SAP Connector (5.4 SR1)?

My QV Server version is 8.50.6299.0409.10 and

SAP Connector: 5.4 SR1 (build 8342)

suniljain
Master

Please try QlikView version 9.00.7320.0409. It should solve your problem.



sparur
Specialist II
Author

It's not so simple 🙂 and I'm not sure that it would solve the problem, because I haven't found any information about an incompatibility between SAP Connector 5.4 and QlikView 8.5.

sparur
Specialist II
Author

I found some interesting things in the TRACE log.

1) Trace log for a run where the reload succeeds and data is fetched from BW:

Trace.TIMESTAMP Trace.TASKTYPE Trace.TRACE
20100713055846 O Incoming OPEN_STREAM started. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 O SQL syntax check OK. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 O Output-Table: DATA512 . Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 O Reading Job Scheduled. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 B Reading Job started. SAP job number: 01584500 , SAP job name: /QTQVC/READ_DATA . Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 B SQL syntax check OK. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 B Fetch-Cursor OK. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 B Export to shared buffer OK. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 B Fetch-Cursor found no data. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055846 B Reading Job ended without errors. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713055847 F Import data from Memory OK.
20100713055847 F Reading job has reached EOF.


2) Trace log for a run where the reload succeeds BUT NO data is fetched from BW:

Trace.TIMESTAMP Trace.TASKTYPE Trace.TRACE
20100713062833 O Incoming OPEN_STREAM started. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062833 O SQL syntax check OK. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062833 O Output-Table: DATA512 . Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062833 O Reading Job Scheduled. Packetsize: 20000 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062833 B Reading Job started. SAP job number: 02283300 , SAP job name: /QTQVC/READ_DATA . Packetsize: 6095 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062833 B SQL syntax check OK. Packetsize: 6095 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062834 B Fetch-Cursor OK. Packetsize: 6095 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062834 B Export to shared buffer OK. Packetsize: 6095 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062834 B Fetch-Cursor found no data. Packetsize: 6095 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062834 B Reading Job ended without errors. Packetsize: 6095 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010
20100713062834 F Reading job has reached EOF.

I see at least 2 differences:

1) 20100713062833 B Reading Job started. SAP job number: 02283300 , SAP job name: /QTQVC/READ_DATA . Packetsize: 6095 , Timeout: 5 , Conversion Routine: 0 , Buffer Percentage: 010



Why does the packet size have the value 6095? In my connection string I use the default value for this parameter (20,000).

2) In the second situation (when I get no data) there is no row in the trace log with 'Import data from Memory'...
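To make the comparison concrete, here is a minimal sketch (plain Python, not an official Qlik tool) that parses trace lines in the format shown above and diffs the two runs: it reports the distinct Packetsize values each run used and the steps that appear only in the good run. The field layout is assumed from the log samples (14-digit timestamp, task type O/B/F, free-text message), and the embedded traces are abbreviated copies of the ones above.

```python
import re

# Assumed line layout: 14-digit timestamp, task type (O/B/F), message text
# that may carry a "Packetsize: N" parameter.
LINE_RE = re.compile(r"^(\d{14})\s+([OBF])\s+(.+)$")
PKT_RE = re.compile(r"Packetsize:\s*(\d+)")

# Abbreviated copies of the two trace runs from the post.
GOOD_TRACE = """\
20100713055846 O Incoming OPEN_STREAM started. Packetsize: 20000 , Timeout: 5
20100713055846 B Reading Job started. SAP job number: 01584500 , SAP job name: /QTQVC/READ_DATA . Packetsize: 20000 , Timeout: 5
20100713055846 B Fetch-Cursor OK. Packetsize: 20000 , Timeout: 5
20100713055847 F Import data from Memory OK.
20100713055847 F Reading job has reached EOF.
"""

BAD_TRACE = """\
20100713062833 O Incoming OPEN_STREAM started. Packetsize: 20000 , Timeout: 5
20100713062833 B Reading Job started. SAP job number: 02283300 , SAP job name: /QTQVC/READ_DATA . Packetsize: 6095 , Timeout: 5
20100713062834 B Fetch-Cursor OK. Packetsize: 6095 , Timeout: 5
20100713062834 F Reading job has reached EOF.
"""

def parse_trace(text):
    """Return a list of (tasktype, message, packetsize-or-None) tuples."""
    rows = []
    for raw in text.strip().splitlines():
        m = LINE_RE.match(raw.strip())
        if m:
            _, task, msg = m.groups()
            pkt = PKT_RE.search(msg)
            rows.append((task, msg, int(pkt.group(1)) if pkt else None))
    return rows

def packet_sizes(rows):
    """Distinct Packetsize values seen in a run."""
    return sorted({p for _, _, p in rows if p is not None})

def missing_steps(good_rows, bad_rows):
    """Steps present in the good run but absent from the bad one.
    Digits (job numbers, packet sizes) are masked so only the step
    text itself is compared."""
    mask = lambda msg: re.sub(r"\d+", "#", msg)
    seen_in_bad = {mask(m) for _, m, _ in bad_rows}
    return [m for _, m, _ in good_rows if mask(m) not in seen_in_bad]
```

Running `packet_sizes` on both runs surfaces the 6095 anomaly (the bad run mixes 20000 and 6095), and `missing_steps` reports that only the good run contains the "Import data from Memory OK." step, matching the two observations above.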

sparur
Specialist II
Author

So I think the problem is the strange packet size in the second situation... and I don't know how we can fix it.

Can anyone help me?
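One possible stop-gap until the root cause is found: since a failed run completes "without errors" but yields zero rows, the reload can be wrapped in a retry loop that treats an empty extract as a failure. This is only a sketch; `reload_fn` is a hypothetical placeholder for whatever actually triggers the QVW reload and reports the fetched row count (e.g. a wrapper around the QlikView command line plus a row-count check), not part of the SAP Connector API.

```python
import time

def should_retry(row_count, attempt, max_attempts=3):
    """Retry while the extract came back empty and attempts remain."""
    return row_count == 0 and attempt < max_attempts

def run_with_retry(reload_fn, max_attempts=3, wait_seconds=0):
    """reload_fn() performs one reload and returns the number of rows fetched.
    Returns (rows, attempts_used); gives up after max_attempts tries."""
    attempt = 0
    while True:
        attempt += 1
        rows = reload_fn()
        if not should_retry(rows, attempt, max_attempts):
            return rows, attempt
        time.sleep(wait_seconds)  # optional back-off between reload attempts

# Simulated flaky source: empty twice, then data on the third try.
# results = iter([0, 0, 5812])
# run_with_retry(lambda: next(results))  # -> (5812, 3)
```

This does not fix the connector, but it keeps the scheduled extract from silently publishing an empty document while the issue is investigated with support.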

pablolabbe
Luminary Alumni

Go back to the previous working version of the SAP connector. Have you contacted QlikTech Support or your partner's support?

sparur
Specialist II
Author

Hello Pablo.

Of course, we have rolled back to the previous version of the SAP Connector. We have also contacted the QV support team, but we haven't gotten an answer yet; unfortunately, it's not as quick as we would like.