I am given a large (3.6 GB) binary which includes data from all over the world and has around 57 tables. Section Access is applied on a column in one of the 57 tables, so my fellow countrymen and I can only work on rows relevant to our country (e.g. Malaysia).
I am trying to export all these tables into QVDs by performing a series of STOREs, one for each of the 57 resident tables. However, the resulting QVDs still contain the data of the global binary (3.6 GB). From there, I would load the QVDs back and use the resulting app's binary for downstream country-specific apps.
Is there a way to export into QVDs only the rows that Section Access allows me to see?
I could apply a WHERE filter on the particular resident table that holds the country column, but that would also require me to understand the complex join conditions across all 57 tables in order to perform the subsequent INNER KEEPs.
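For context, my export step is currently just a straight STORE of every resident table. Roughly equivalent to this sketch (the lib connection and variable names are placeholders; the real script covers all 57 tables):
// Sketch of the current export step: stores every resident table into its own QVD.
// 'lib://QVDs' is a placeholder connection; the QVDs come out at full global size
// because STORE writes whatever rows are currently in the resident table.
For vExport.t = 0 to NoOfTables() - 1
    Let vExport.Table = TableName($(vExport.t));
    Store [$(vExport.Table)] Into [lib://QVDs/$(vExport.Table).qvd] (qvd);
Next vExport.t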
Section Access data reduction happens when a user logs in to the app; however, all of the data is still stored in the model. I don't believe there is a way to leverage Section Access to affect the STORE command.
I think you are heading in the right direction with a WHERE clause. If you don't want to go through the process of learning all the inner workings of the model, this might help get you started.
You'll need to make your Section Access (or limit) table the center of the universe, and then list all the tables in your data model along with the table each one links to. The script will then loop through all the tables, doing a LEFT KEEP to limit the data. Keep the order of operations in mind.
// ********************************************************************************************************************************************************
// **
// ** Data Model Reduction
// **
// ********************************************************************************************************************************************************
/*
This script defines a table as the center of your universe, usually the fact table. You first limit that center table with a WHERE clause
describing the subset you want. The script that follows then limits all the other tables based on the relationships you define, using resident
loads and KEEP statements.
*/
// ****************************************************************************
// ** Sub Routine
// ****************************************************************************
Sub DataModelReduction(pTableField, pKeepField, pTable)
    Let vDMR.TableCount = NoOfRows('$(pTable)');                                  // Determine the number of tables to reduce.
    For vDMR.t = 0 to $(vDMR.TableCount) - 1                                      // Loop through the listed tables.
        Let vDMR.TableName = Peek('$(pTableField)', $(vDMR.t), '$(pTable)');      // Get the name of the table to reduce.
        Let vDMR.KeepTableName = Peek('$(pKeepField)', $(vDMR.t), '$(pTable)');   // Get the table it is limited against.
        Rename Table '$(vDMR.TableName)' to '$(vDMR.TableName)_tmp';              // Rename the original table.
        '$(vDMR.TableName)':                                                      // Reload the table, limiting the dataset to
        Left Keep('$(vDMR.KeepTableName)')                                        // the rows matching the related table.
        Load
            *
        Resident
            '$(vDMR.TableName)_tmp';
        Drop Table '$(vDMR.TableName)_tmp';                                       // Drop the original table.
        vDMR.TableName=; vDMR.KeepTableName=;                                     // Loop clean up.
    Next vDMR.t;
    vDMR.t=; vDMR.TableCount=; pTableField=; pKeepField=; pTable=;                // Sub clean up.
End Sub;
// ********************************************************************************************************************************************************
// ** Example
// ********************************************************************************************************************************************************
Binary [lib://Apps/Assignment.qvw]; // Binary load the application with the data model you want to reduce (Binary must be the first statement in the script).
/*
Manually limit the center of the universe table.
*/
Rename Table Assignment to Assignment_tmp; // Rename original table
Assignment:
NoConcatenate Load
*
Resident
Assignment_tmp
Where
_PastFlag <> 1; // Where clause limiting the Assignment table; the reduction will propagate through the model.
Drop Table Assignment_tmp; // Clean Up
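In your case, the filter on the center table would be the country column that Section Access reduces on, applied to whichever table carries it. A hypothetical version of the same step (CountryTable and Country are placeholder names for your actual table and field):
// Hypothetical adaptation for the country scenario; CountryTable and Country are placeholders.
Rename Table CountryTable to CountryTable_tmp;                // Rename original table.
CountryTable:
NoConcatenate Load
    *
Resident
    CountryTable_tmp
Where
    Country = 'Malaysia';                                     // Keep only the rows allowed for this country.
Drop Table CountryTable_tmp;                                  // Clean up.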
/*
Here we define the tables and their limiting counterparts. The reduction happens sequentially, so
be careful to list the tables in the order they need to be limited. For example,
if you swapped the 'Employee' and 'Contact' records below, the 'Contact' table would be limited
against a still-complete 'Employee' table, which is normally not the intended result.
*/
LimitTables:
Load * Inline [
Table, KeepTable
Absence, Assignment
Organization, Assignment
Location, Assignment
Term, Assignment
PeopleGroup, Assignment
Salary, Assignment
ProposedSalary, Assignment
NewHire, Assignment
EffectiveDateLink, Assignment
MasterCalendar, EffectiveDateLink
To_Date, MasterCalendar
From_Date, MasterCalendar
Employee, Assignment
Contact, Employee
VetDetail, Employee
SupervisorHierarchy, Employee
AbsentEmployees, Employee
Job, Assignment
Competency, Job
WorkersComp, Job
];
Call DataModelReduction('Table', 'KeepTable', 'LimitTables'); // This reduces all the tables defined in the LimitTables table. Those not listed are left as is.
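Once the reduction has run, the resident tables only hold the country-specific rows, so your STORE step will now produce reduced QVDs. A minimal sketch, assuming a lib connection named QVDs and a country suffix you set yourself (both placeholders); LimitTables is dropped first so the helper table is not exported:
Drop Table LimitTables;                                       // Helper table is no longer needed.
Set vCountry = MY;                                            // Placeholder country suffix for the QVD names.
For vStore.t = 0 to NoOfTables() - 1
    Let vStore.Table = TableName($(vStore.t));
    Store [$(vStore.Table)] Into [lib://QVDs/$(vStore.Table)_$(vCountry).qvd] (qvd);   // Reduced, country-specific QVD.
Next vStore.t
vStore.t=; vStore.Table=; vCountry=;                          // Clean up.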