Diego_780
Creator

SUB routine apparently not working to make prefixes on table dimensions (like qualify)

Hello Qlik pros!

I need some help, because I don't understand the behavior of what I have.

I have a subroutine that takes a table name, builds a prefix from it, and renames every field of that table with that prefix (basically a manual QUALIFY):

SUB PrefixFields(vTableName)
    LET vNumFields = NoOfFields('$(vTableName)');
    FOR i = 1 TO $(vNumFields)
        LET vField = FieldName($(i), '$(vTableName)');
        // Skip technical fields and fields that are already prefixed
        IF Left('$(vField)', 2) <> '__' AND Index('$(vField)', '$(vTableName)-') = 0 THEN
            LET vNewField = '$(vTableName)-' & '$(vField)';
            RENAME FIELD [$(vField)] TO [$(vNewField)];
            LET vField2 = FieldName($(i), '$(vTableName)');
        ENDIF
    NEXT i
END SUB

LET vURL = '"http://ws.visualnacert.com/vnwebservices/user/$(vUserid)/v3/persontypes"';

RestConnectorMasterTable:
SQL SELECT 
    "__KEY_root",
    (SELECT 
        "id",
        "value",
        "__FK_data"
     FROM "data" FK "__FK_data")
FROM JSON (wrap on) "root" PK "__KEY_root"
WITH CONNECTION ( URL $(vURL));

[PersonTypes_data]:
LOAD    
    [id],
    [value]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_data]);
DROP TABLE RestConnectorMasterTable;

CALL PrefixFields('PersonTypes_data');
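For reference, the renames can be double-checked right after the call by tracing the field names (a quick sketch; f and vCheckField are throwaway names for this check):

FOR f = 1 TO NoOfFields('PersonTypes_data')
    LET vCheckField = FieldName($(f), 'PersonTypes_data');
    TRACE Field $(f): $(vCheckField);
NEXT f

If the subroutine worked, this should log the prefixed names (PersonTypes_data-id, PersonTypes_data-value).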

 

When I go to the data model viewer, it looks perfect; for now I have 3 tables, and they do what they are programmed to do:

[screenshot: data model viewer showing the tables with prefixed field names]

But when I try to store them into QVDs with this script, it doesn't store them with that prefix; instead, it stores the original field names (id, desc):

For i = 0 to NoOfTables()-1
    LET vTabName = TableName(0);
    STORE [$(vTabName)] INTO [$(vRutaQVD_VISUAL)$(vTabName).qvd] (qvd);
    DROP TABLE [$(vTabName)];
Next i

 

How is that possible, or what am I doing wrong exactly? From what I can see, the SUB and CALL apparently do the job.

Thanks in advance,

Diego.

3 Replies
marcus_sommer

You may not be looking at the wanted QVDs but at older (temporary) ones, because the store routine uses TableName(0) instead of TableName($(i)). Further, if I remember correctly, this kind of loop iteration needs to be reversed if it contains drop logic - meaning something like:

for i = nooftables()-1 to 0 step -1

To check this kind of logic you could comment out the store + drop and add a trace statement like:

trace $(i) - $(vTabName);
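Putting both corrections together (TableName($(i)) plus the reversed iteration), the store loop might look like this - a sketch that reuses the vRutaQVD_VISUAL variable from the original post:

FOR i = NoOfTables()-1 TO 0 STEP -1
    LET vTabName = TableName($(i));
    TRACE Storing table $(i) - $(vTabName);
    STORE [$(vTabName)] INTO [$(vRutaQVD_VISUAL)$(vTabName).qvd] (qvd);
    DROP TABLE [$(vTabName)];
NEXT i

Counting down means each DROP removes only tables with a higher index than the ones still to be visited, so the indices of the remaining tables never shift under the loop.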

Further, I could imagine that you may get the above-described results with qualify statements, maybe like:

qualify *;
unqualify __*;

load ...;
store ...;

unqualify *;

Beside this, I suggest rethinking the entire approach, because this qualifying will later cause a lot of effort to remove again, or lead to complex and unsuitable data models. A BI data model should be de-normalized as much as possible. If tracking/differentiating of the data is needed, this information could be added as an extra source field within the table.

Diego_780
Creator
Author

Hello Marcus,

 

Thanks for the response. I use this SUB so I don't have to use qualify * and unqualify * everywhere; there are hundreds of calls to this REST API, so I thought of this to try to optimize the script.

Also, because there are so many calls, the fields often come with the same names, so I want to qualify them with the name of the table, because it's the only way I can see to differentiate the fields.

Any tips?

 

Thanks.

marcus_sommer

It's very common that many tables share equal and/or similar fields, like dates, names, amounts... But this doesn't mean that they couldn't be differentiated without qualifying, because their source table defines the content - and just adding the table name as an extra field will in general be sufficient.
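For example, instead of prefixing every field, the table name could be loaded as one extra field (a sketch against the load from the original post; SourceTable is an assumed field name):

[PersonTypes_data]:
LOAD
    'PersonTypes_data' AS [SourceTable],
    [id],
    [value]
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_data]);

All field names stay unchanged for the data model, and the SourceTable field can be used in set analysis or as a filter whenever the origin of a record matters.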

A BI data model shouldn't contain hundreds of separate tables; instead they should be merged as much as possible in the direction of a star schema with a single fact table and n dimension tables. This means n concatenate + join steps are performed to get such a result, and it may end in tables like:

ID   Date         Type       Amount
1    2025/10/17   Order      1
1    2025/10/18   Billing    1
1    2025/10/19   Shipment   1
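Such a combined fact table could be built by concatenating the single sources, for example (a sketch with assumed table and field names Orders, Billings, Shipments):

Facts:
LOAD [ID], [OrderDate] AS [Date], 'Order' AS [Type], [Amount] RESIDENT Orders;
CONCATENATE (Facts)
LOAD [ID], [BillingDate] AS [Date], 'Billing' AS [Type], [Amount] RESIDENT Billings;
CONCATENATE (Facts)
LOAD [ID], [ShipmentDate] AS [Date], 'Shipment' AS [Type], [Amount] RESIDENT Shipments;

Each source keeps its specific date under the common Date field, and the Type field preserves the distinction that the separate tables previously carried.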


Approaches that keep such facts within n fact tables and/or use n date fields (OrderDate, BillingDate, ShipmentDate) are of course possible, but they will cause more complexity + much more effort (development + maintenance), usually slower performance, and disadvantages in usability.

Therefore don't qualify your fields.