Not applicable

QV Connector vs Xtract QV (THEOBALD)

Hi,

I'm currently evaluating the SAP connectors for QlikView that are available on the market: the QlikView SAP Connector and Xtract QV by Theobald (http://www.theobald-software.com/en/products/xtractqv.htm).

I did a light test of both with BW queries and SAP tables.

In terms of GUI and ease of use, I think Xtract QV is easier to use and set up: you copy-paste the generated script into QV and reload, so it only takes a few short steps to get what you want, though you do work in two different interfaces.
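
For illustration, the pasted script is essentially a plain QlikView load from the URL that the Xtract QV service exposes; a minimal sketch, assuming QVX delivery over a local port (the port, extraction name and table label are placeholders, not taken from a real extraction):

MaterialMaster:
LOAD *
FROM [http://localhost:8065/?name=MaterialMaster] (qvx);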

As for the QV Connector, the GUI is OK but could be better. The good thing is that everything is done inside QV, but for now I'm stuck: I have my script, but it won't reload, and I'm waiting for a solution from support.

Has anyone been through the same evaluation process?
Can you share any information or concerns about acquiring either connector?

PS:
Does anyone know whether SAP charges the client for using this kind of connector to extract SAP data, or is that already covered by the connector's license?

Also, are there any limits on the amount of data or the total column width that can be extracted?


Thanks.


15 Replies
suniljain
Master

"i'm stuck, i got my script, but it wont reload, waiting for solution from the support."

Not getting your above statement. can you post more details.

Not applicable
Author

Hi Sunil,

That means I'm able to generate the connection string (CUSTOM CONNECT TO), then select the fields I want and get the full script. But when I click RELOAD, I get an error with no specific message telling me what the problem is.

[CUB_FI]:
Load *;
Select PseudoMDX ( .........

By the way, which connector have you tried?



suniljain
Master

We are using SAP Connector. It works fine with SAP BW.

Not applicable
Author

Hi Sunil,

Is there any cost charged by SAP for using its data externally?

I'm not sure whether the connector license includes the right to use SAP data externally, or whether we have to pay SAP separately.

Thanks.

guytzumer
Partner - Contributor III

The data is yours and the connector works fine.

This is not a matter for SAP; they cannot charge you for it.

As for the connector to R/3, it is easy and intuitive to use. After the installation (if you do it right, with the right transports) you have the ScriptBuilder, which maps all your SAP tables and fields; in addition, if you choose a table, it builds the script to pull that table. Very easy - a powerful tool in my opinion!
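
For what it's worth, the table pull it generates is roughly of this shape; a sketch only, assuming the connection string has already been produced by the ScriptBuilder, and using MARA (material master) purely as an example table:

// connection string as generated by the ScriptBuilder, e.g.
// CUSTOM CONNECT TO "<provider and SAP logon parameters>";
[MARA]:
LOAD *;              // preceding load takes all fields returned by the SELECT
SELECT *
FROM MARA;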

suniljain
Master

We do not have to pay anything to SAP AG.

The only cost is the SAP Connector itself.

Not applicable
Author

We are using the Theobald extractor successfully in our Dev and Prod environments. Don't hesitate to ask about our experience and/or to raise existing issues; we have a lot to share.

Not applicable
Author

Hi,

Did you pay SAP for extracting their data?
If your answer is no, then my next question is: did you check with SAP whether it is chargeable to use Xtract QV to extract SAP data?

You mentioned Dev and Prod. Do they offer a dev license at a lower cost, or did you simply buy two licenses from them?

As of today, what is your longest extraction time from SAP?
What are the most common errors from Xtract QV on your server?
Are you running on a 64-bit machine?

Thanks.

Not applicable
Author

Hi,

So many questions :-). Let me answer them step by step:

  • Payment: definitely no need for that. You are allowed to pull whatever you want, as long as you stay with SAP's standard extraction methods / interfaces. Theobald covers several of them: extraction via SAP table or view, SAP query, RFC or BAPI, ABAP reports, BW cube / query, Open Hub, BW hierarchies, and Delta Q.
  • Dev and Prod: there is no separate license available. You buy one and use it for any purpose.
  • Longest extraction time: it depends mostly on your data model / efficiency, the volume, and the extraction method (see above). For SAP BW masterdata (volumes up to 10 million records) you can stay with table extraction; this works like a charm. However, there is a limitation of roughly 120 source attributes, and I haven't figured out why (an SAP and/or XQV limitation). Workaround: split the extraction into multiple parts and join them into one table in QV (see the sketch after this list). Extracting transactional data is a little trickier: the query extraction method is limited by SAP to a maximum of 1 million output records, and our cubes are really large (300 million rows). Lessons learned: this is where Open Hub comes into play. Also forget about cube extraction (very slow). If you don't want to stick with Open Hub, consider table (!) extraction for high-volume cube data: extract the F and E table data plus the DIM tables, which is very, very fast, even for many attributes / measures. Additionally you have to extract the SID tables. Join them and voila... the classical BW star schema is transferred to QV.
  • Errors / strange behaviour within XQV: almost none. XQV is pretty stable.
  • Hardware: we run 64-bit machines only.
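
A minimal sketch of the split-and-join workaround mentioned above, assuming both partial extractions have been staged as QVD files; the field and file names are made up for illustration:

// first part of the attributes, extracted separately
Masterdata:
LOAD [0MATERIAL],
     [0MATL_TYPE],
     [0MATL_GROUP]
FROM MD_MATERIAL_PART1.qvd (qvd);

// second part; LEFT JOIN merges the two extractions on the shared key 0MATERIAL
LEFT JOIN (Masterdata)
LOAD [0MATERIAL],
     [0DIVISION],
     [0BASE_UOM]
FROM MD_MATERIAL_PART2.qvd (qvd);

The same join pattern works for the cube case: load the F / E fact table extraction first, then LEFT JOIN each SID table extraction on its SID key so the surrogate keys resolve to readable characteristic values.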

Cheers,

Nl