I am using QlikView 9 (v9.00.7320.0409) with QvSAPDSOConnector v5.3 and QvSAPOLAPConnector v5.3.
I am trying to extract data from SAP ODSs using the QvSAPDSOConnector, with varying results depending on the query code and/or the volume of data in the ODS.
What I am looking for is some advice/knowledge from the community on the logic behind the results I am getting, and a possible solution.
I have an ODS with 600k records in it, and using the code below I can extract all 600k records across 65 characteristics (approx. 39 million cells of information?):
CUSTOMCONNECT TO "Provider=QvSAPDSOConnector;ASHOST=SERVER_NAME;SYSNR=10;CLIENT=500;";
SQL Select (NoKey) * from ODS_NAME;
Store * from [ODS_NAME] into D:\FILE_NAME.QVD;
Drop table [ODS_NAME];
I am successful with this, and similarly successful when connecting to another ODS with 100k records across 90 characteristics (approx. 9 million cells of information?). But when trying the same code on an ODS with 18m rows in it (years 2001-2010) across 90 characteristics, I don't get the same results and get the error below.
I tried reducing ODSMaxRows to 500k, and the result is saved into a QVD. I get 500k rows in there, but not the first 500k records of the ODS in the way I would expect (e.g. 500k records from year 1); instead they are spread more or less evenly across six of the years, 2001-2007: from 0CALDATE 01/01/2001 until 10/12/2007, some transactions every day, no days missing, 2,536 days in total. Is there some significance to this number?
I also get this error when the script has finished. It's no big deal as long as the QVD is created, so I am not too worried, but I would prefer not to get it, or at least to understand why.
With ODSMaxRows at 1,000,000, the result is saved into a QVD. I get 1m rows in the QVD, but again not the first 1m records; they are spread across the years up to 2007, with exactly the same dates as above and the same error message.
Increasing ODSMaxRows above 1,000,000 results in a different error and no QVD. I have tried both 1,500,000 and 1,00,100 and get the error.
This, I believe, is due to table space allocation within the SAP server/environment. I have increased the table space allocation to the maximum size within our system, as recommended by SAP.
I have also tried a preceding "where 0CALYEAR = 2002" statement. That does work at getting the records and returning only the ones from 2002, but not 1m of them; it will still extract the same amount as the method above... strange?
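For clarity, this is the kind of filtered extract I mean, a minimal sketch reusing the placeholder names from the earlier snippet (SERVER_NAME, ODS_NAME and the file path are illustrative, not our real values):

```
// Sketch only: same placeholder connection as the earlier snippet.
// The where clause pushes the year filter to the connector so only
// 2002 records should come back from the DSO.
CUSTOMCONNECT TO "Provider=QvSAPDSOConnector;ASHOST=SERVER_NAME;SYSNR=10;CLIENT=500;";
SQL Select (NoKey) * from ODS_NAME where 0CALYEAR = 2002;
Store * from [ODS_NAME] into D:\FILE_NAME_2002.QVD;
Drop table [ODS_NAME];
```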
Seeing that the default setting for ODSMaxRows is 10,000,000 suggests it should be possible to extract that much from SAP ODSs. If anyone has experience of extracting these volumes or higher with the QvSAPDSOConnector, I would be grateful to hear about it.
We have also tried the OLAP connector, but run into similar problems with more than 50 characteristics with associated attributes, etc.
Any help on this subject would be very helpful indeed.
Yes, Z0TOUAV5 is a DSO.
I am now using the QvSAPConnector and do not have any problems, but there are some things you can't get using this method, like navigation attribute information.
I haven't revisited the DSO or OLAP methods for a long time, but I must do so soon.
How are you performing incremental extraction from an SAP BW cube?
Please give me details if possible.
I am extracting from DSOs at the moment, at the atomic level; I am not using cubes. You would have to set up the cubes in such a way that there is a date-changed field at the lowest level of data in the cube, probably day level; then this principle would work.
We have a field that holds the last date a record was changed, and I extract where that field has a value on or after today()-4. That way, on a Monday it will pick up transactions from Friday and the weekend, just in case things fall over in the system over the weekend (not unusual for SAP). I save this file as a daily delta QVD.
Once I have that, I load my historic QVD, keeping only the records whose transaction IDs do not already exist in the delta, and then re-save the combined table as the historic QVD. You must be careful when setting this up: if you do a partial reload you will effectively delete the records in your history file, so make sure you have a backup of the historic file.
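The two steps above can be sketched in QlikView script roughly like this. The field names (CHANGEDATE, TRANSID), table names and file paths are hypothetical placeholders, not our actual model, and a CUSTOMCONNECT like the one earlier in the thread must already be open:

```
// Step 1: daily delta - records changed on or after today()-4.
// CHANGEDATE and TRANSID are illustrative names; substitute your
// DSO's change-date field and transaction key.
Delta:
SQL Select (NoKey) * from ODS_NAME
  where CHANGEDATE >= '$(=Date(Today()-4, 'YYYYMMDD'))';
Store * from Delta into D:\DELTA.QVD;

// Step 2: append historic rows whose keys are NOT already in the
// delta just loaded, then overwrite the historic QVD with the
// combined table. Keep a backup of HISTORIC.QVD first - a partial
// reload here would wipe your history.
Concatenate (Delta)
LOAD * FROM D:\HISTORIC.QVD (qvd)
Where Not Exists(TRANSID);
Store * from Delta into D:\HISTORIC.QVD;
```

Note that the Where Not Exists() clause makes the QVD load unoptimized, which is the usual trade-off in this incremental pattern.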
Suppose I want to do the same thing on SAP R/3. How is it possible to extract huge SAP cluster tables like KONV and MSEG?