Qlik Community

New to Qlik Sense

If you’re new to Qlik Sense, start with this Discussion Board and get up to speed quickly.

jim_chan
Specialist

Qlik Sense - Any sample of an extractor QVF file?

Hi guys,

I need to ask you something about Qlik Sense development.

Do you have a practice of creating an extractor layer?

What I mean by an extractor layer is this: a QVF file that grabs more than 20 (data) tables out of SQL Server and then generates QVD files.

2nd step: once you have all the QVD files, you use them to load your (dashboard) QVF file.
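In its simplest form, the two-layer pattern described above is two separate load scripts. (A minimal sketch; the connection, table, and path names are placeholders, not from the original post.)

```qlik
// ----- Extractor QVF (script 1): database -> QVD files -----
// 'SQLServer_Conn' and dbo.Customer are placeholder names
LIB CONNECT TO 'SQLServer_Conn';

Customer:
SQL SELECT * FROM dbo.Customer;
STORE Customer INTO [lib://QVDs/Customer.qvd] (qvd);
DROP TABLE Customer;
// ...repeat (or loop) for each of the 20+ source tables

// ----- Dashboard QVF (script 2): QVD files -> data model -----
Customer:
LOAD * FROM [lib://QVDs/Customer.qvd] (qvd);
```

The dashboard QVF never touches the database directly, so reloads are fast and the database is hit only once per batch.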

I would like to know: has anyone created such an extractor file in Qlik Sense?

Rgds,

Jim

5 Replies
petter
Partner

I do. I have made a fairly generic extractor, which I named ET1 (Extract and Transform layer 1).

The load script code is on GitHub, but the documentation is not there (I include the full script below, as it is actually less than 100 lines). It is purely data-driven by tables in SQL Server.

The load script is very minimal and might not be ready for full production use without some more error handling:

// VERSION: 1.06
// DATE:    2016-05-10

ODBC CONNECT TO MSSQL;

SUB $_Prepare_Extract

  FIELDS:
  SQL SELECT * FROM QLIK_ETL.dbo.VERSIONED_COLUMNS WHERE VERSION=$(version) ORDER BY QT, COLUMN_ID;

  TABLES:
  LOAD * WHERE Exists(SOURCE_TABLE); SQL SELECT * FROM QLIK_ETL.dbo.TABLES;

  [MAP_ADD_COMMENTS]:
  MAPPING LOAD
    FIELD_NAME AS FIELD,
    '__________________________________________' & Chr(10) & '  COMMENT: ' & Chr(10) & '  ' & FIELD_COMMENT & Chr(10) &
    '__________________________________________' & Chr(10) & '  BUSINESS RULE: ' & Chr(10) & '  ' & FIELD_COMMENT & Chr(10) &
    '__________________________________________' & Chr(10) & '  SOURCE: ' & Chr(10) & '  ' & SOURCE_TABLE & '.' & COLUMN_NAME & '.' & Chr(10) & Chr(10)
      AS COMMENTS
  RESIDENT [FIELDS];

  [MAP_NAME_FIELDS]:
  MAPPING LOAD
    [SOURCE_TABLE] & '.' & [COLUMN_NAME], [FIELD_NAME]    /* Capitalize(Replace(Replace(Replace(Replace([ET.Transform_Felt],'_',' '),'AA','Å'),'OE','Ø'),'AE','Æ')) */
  RESIDENT [FIELDS];

  UNQUALIFY *;

ENDSUB

SUB $_OneDBtableToQVD( DBtable , QlikTable , DB )

  // Build the LOAD and SQL statement for one source table from the FIELDS metadata
  [LOAD and SQL]:
  LOAD
    '[$(QlikTable)]:' & Chr(10)
    & 'LOAD ' & Chr(10) & '    ' & Concat(DISTINCT TYPE & '([' & COLUMN_NAME & ']) AS [' & COLUMN_NAME & ']', ',' & Chr(10) & '    ', COLUMN_ID) & ' ; ' & Chr(10)
    & 'SQL ' & Chr(10) & '  SELECT $(SELECT_TOP)' & Chr(10) & '    ' & Concat(DISTINCT COLUMN_NAME, ',' & Chr(10) & '    ', COLUMN_ID) & Chr(10)
    & '  FROM' & Chr(10) & '    $(DB).$(DBtable); ' & Chr(10) AS LOAD_AND_SQL_STATEMENT
  RESIDENT [FIELDS]
  WHERE SOURCE_TABLE = '$(DBtable)'
  GROUP BY SOURCE_TABLE;

  // Append the generated statements (load, rename, comment, store, log, drop) to loadSQL
  LET loadSQL = loadSQL &
    Peek('LOAD_AND_SQL_STATEMENT',0,'LOAD and SQL') &
    'RENAME FIELDS USING [MAP_NAME_FIELDS];' & Chr(10) &
    'COMMENT FIELDS USING [MAP_ADD_COMMENTS];' & Chr(10) &
    'STORE [$(QlikTable)] INTO "$(QlikTable).QVD" (QVD);' & Chr(10) &
    'IF logging THEN' & Chr(10) &
    '  noOfRows = NoOfRows(''$(QlikTable)'');' & Chr(10) &
    '  now = Date(Now(),''YYYY-MM-DD hh:mm:ss'');' & Chr(10) &
    '  fileSize = FileSize(''$(QlikTable).QVD'');' & Chr(10) &
    '  SQL INSERT INTO QLIK_ETL..LOG VALUES(''$(DBtable)'',''$(QlikTable)'',$' & '(fileSize),''LOADED'',$' & '(noOfRows),''QET1'',''$' & '(batchStart)'',''$' & '(now)'');' & Chr(10) &
    '  noOfRows=; now=; fileSize=;' & Chr(10) &
    'ENDIF' & Chr(10) &
    'DROP TABLE [$(QlikTable)];' & Chr(10) & Chr(10)
  ;

  DROP TABLE [LOAD and SQL];

ENDSUB

SET ErrorMode = 0;

version = 3;

CALL $_Prepare_Extract

logging = True();
maxTableNo = NoOfRows('TABLES');
SELECT_TOP = ' ';   // A LIMIT ON THE NUMBER OF ROWS RETURNED FROM A SQL SELECT (E.G. 'TOP 100 ')
batchStart = Date(Now(),'YYYY-MM-DD hh:mm:ss');

loadSQL = '';
FOR tableIndex = 0 TO NoOfRows('TABLES')-1
  CALL $_OneDBtableToQVD( Peek('SOURCE_TABLE',tableIndex,'TABLES') , Peek('QLIK_TABLE',tableIndex,'TABLES') , Peek('DB',tableIndex,'TABLES') );
NEXT

$(loadSQL);   // EXECUTE ALL THE GENERATED STATEMENTS IN ONE GO

loadSQL=; version=; tableIndex=; logging=; maxTableNo=; SELECT_TOP=; batchStart=;   // GARBAGE COLLECTION: CLEAR ALL LOAD-SCRIPT-ONLY VARIABLES
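To make the generation concrete: for a hypothetical metadata row set describing one source table CUSTOMER with two columns (NAME typed as Text, AMOUNT typed as Num), the fragment accumulated into loadSQL and later executed by $(loadSQL) would look roughly like this. All table and column names here are invented for illustration; they would come from the QLIK_ETL specification tables.

```qlik
[Customer]:
LOAD
    Text([NAME]) AS [NAME],
    Num([AMOUNT]) AS [AMOUNT] ;
SQL
  SELECT
    NAME,
    AMOUNT
  FROM
    MYDB.dbo.CUSTOMER;
RENAME FIELDS USING [MAP_NAME_FIELDS];
COMMENT FIELDS USING [MAP_ADD_COMMENTS];
STORE [Customer] INTO "Customer.QVD" (QVD);
// ...logging block (row count, file size, SQL INSERT into the log table) omitted...
DROP TABLE [Customer];
```

One such fragment is appended per row of the TABLES metadata table, so adding a new source table means adding metadata rows, not script code.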

The necessary SQL script to create the supporting tables in SQL Server is attached.

I can provide more info tomorrow with instructions on how to get going with the solution.

I also attached the PDF of my Qonnections 2016 session presentation about the solution.

jim_chan
Specialist
Author

Wow... that's so complicated... I can't understand it...

Vasiliy_Beshentsev
Creator III

Hello Petter, can you explain where we can use this code in practice?

Thanks in advance

petter
Partner

It does work, so it's not necessary to understand the exact code to use it. That is precisely the point: the entire extract process can be purely table-driven, just by maintaining the specification tables in SQL Server. You only need to understand the code if you want to extend the functionality, not when you add more tables.

I uploaded this just to confirm that there is some practice around this. The documentation is lacking, although the presentation gives an idea of the benefits and why it is a good idea to take such an approach to extracts.

I intend to produce better how-to documentation; I already have demo data that demonstrates how well it works with 36 tables from SQL Server...

Is this what you were looking for?

petter
Partner

I have been on two projects where I made version 1 in the first and version 2 in the second. That was in 2015. The first version was implemented at a TV-broadcasting and streaming corporation, where it went hand-in-hand with a high-quality data warehouse.

Version 2 was implemented at a large pension fund. The third version, which I posted here, was developed during Q1 2016 and presented at Qonnections 2016.

So the main point is to use this when you have a large number of tables to extract (more than ~20) and want to minimize the amount of load script code you have to write. This has huge benefits: much less script maintenance and much more flexibility. The attached PDF goes into why many organisations would benefit from choosing such an approach.