Not applicable

Which is the best suited ETL tool to feed QlikView Applications?

I know that a couple of years back QlikTech acquired the ETL tool Expressor (along with Michael Trallo). Still, I would like to know which ETL tool is best suited to work with QlikView. If possible, please provide a list of ETL tools that work well with QlikView. Thanks.

Note: the data sources could be Oracle, DB2 and/or SQL Server. Big Data is a future focus.

Regards,

-Vijay

9 Replies
Anonymous
Not applicable
Author

How about simply doing the ETL in the QlikView script?
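For what it's worth, a minimal sketch of that approach could look like the script below. The ODBC DSN (OracleDSN) and source table (SALES_RAW) are hypothetical; the idea is simply to extract over ODBC, transform in a preceding LOAD, and persist the result as a QVD for reuse.

    // Extract: connect to the source (DSN name is hypothetical).
    ODBC CONNECT TO OracleDSN;

    Sales:
    LOAD
        ORDER_ID                 AS OrderID,
        CUSTOMER_ID              AS CustomerID,
        QTY * UNIT_PRICE         AS SaleAmount,     // Transform: derive a value
        Date(Floor(ORDER_DATE))  AS OrderDate       // Transform: normalise the timestamp
    ;
    SQL SELECT ORDER_ID, CUSTOMER_ID, QTY, UNIT_PRICE, ORDER_DATE
    FROM SALES_RAW;

    // Load: persist the transformed table so other documents can reuse it.
    STORE Sales INTO SalesTransformed.qvd (qvd);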

Not applicable
Author

Hi Bill

ETL in QlikView scripting is a simple and straightforward approach. It may be good enough for a specific department or function, but not for cross-functional applications where we need to extract data from multiple sources and make it available to multiple BI applications.

Expressor is one option, but it is tied to QlikView only and does not support other BI tools like Tableau or Spotfire.

Sometimes customers want to adopt QlikView as a front-end application that needs to extract data prepared by existing ETL tools like Informatica or DataStage.

It would be great if QlikTech could share a list of supported ETL tools, in addition to direct data extraction through scripting.

Regards,

-Vijay

Not applicable
Author

Hi,

In my experience, "any" ETL tool can be used with QlikView, and the choice depends on many other factors (IT department (DSI) directives, budget, competencies...).

More precisely:

  • you use different data sources (databases, flat files...)
  • the ETL tool takes these sources, transforms them, and stores the result into one (or more) targets (ODS, DWH, data mart...)
  • finally, QV extracts this data to feed your dashboards, with or without a QVD step (see the sketch after this list)
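To make the "with a QVD step" option concrete, here is a rough sketch (the DSN and table names are invented): an extractor document reads the warehouse table that the external ETL tool has populated and stores it as a QVD, and the dashboard documents then load only the QVD.

    // Extractor document: pull from the DWH target filled by the ETL tool.
    ODBC CONNECT TO DwhDSN;                    // hypothetical DSN

    FactSales:
    SQL SELECT * FROM DWH.FACT_SALES;          // hypothetical warehouse table

    STORE FactSales INTO FactSales.qvd (qvd);
    DROP TABLE FactSales;

    // Dashboard document(s): read only the QVD, no database load needed.
    FactSales:
    LOAD * FROM FactSales.qvd (qvd);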

Some ETL tools I have used:

  • SSIS (works perfectly with SQL Server)
  • Ab Initio (expensive, but very efficient for processing large volumes of data)
  • BO Data Integrator (fewer functions, but very easy to use)
  • Talend (a free version exists)
evan_kurowski
Specialist

When I look up 'ETL' on Wikipedia (http://en.wikipedia.org/wiki/Extract,_transform,_load), these are the excerpts I find:

Extract:
"The first part of an ETL process involves extracting the data from the source systems. In many cases this is the most challenging aspect of ETL, since extracting data correctly sets the stage for the success of subsequent processes."

"An intrinsic part of the extraction involves the parsing of extracted data, resulting in a check of whether the data meet expected patterns or structures. If not, the data may be rejected entirely or in part."

Transform: 
1 Selecting only certain columns to load
2 Translating coded values (e.g., if the source system stores 1 for male and 2 for female, but the warehouse stores M for male and F for female)
3 Encoding free-form values (e.g., mapping "Male" to "M")
4 Deriving a new calculated value (e.g., sale_amount = qty * unit_price)
5 Sorting
6 Joining data from multiple sources (e.g., lookup, merge) and deduplicating the data
7 Aggregation (for example, rollup — summarizing multiple rows of data — total sales for each store, and for each region, etc.)
8 Generating surrogate-key values
9 Transposing or pivoting (turning multiple columns into multiple rows or vice versa)
10 Splitting a column into multiple columns (e.g., converting a comma-separated list, specified as a string in one column, into individual values in different columns)
11 Disaggregation of repeating columns into a separate detail table (e.g., moving a series of addresses in one record into single addresses in a set of records in a linked address table)
12 Lookup and validate the relevant data from tables or referential files for slowly changing dimensions.
13 Applying any form of simple or complex data validation. If validation fails, it may result in a full, partial or no rejection of the data, and thus none, some or all the data are handed over to the next step, depending on the rule design and exception handling. Many of the above transformations may result in exceptions, for example, when a code translation parses an unknown code in the extracted data.
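To connect this list back to the thread, here is a hedged sketch of what a few of these transformations (code translation, deriving a value, splitting a column, aggregation) can look like in QlikView's own load script; every table and field name below is invented for illustration.

    // Items 2/3: translate coded values with a mapping table.
    GenderMap:
    MAPPING LOAD * INLINE [
    Code, Gender
    1, M
    2, F
    ];

    Orders:
    LOAD
        OrderID,
        StoreID,
        ApplyMap('GenderMap', GenderCode, 'Unknown') AS Gender,      // items 2/3
        Qty * UnitPrice                              AS SaleAmount,  // item 4: derived value
        SubField(Tags, ',', 1)                       AS FirstTag     // item 10: split a column
    FROM Orders.csv (txt, utf8, embedded labels, delimiter is ',');

    // Item 7: aggregation - total sales per store.
    SalesByStore:
    LOAD
        StoreID,
        Sum(SaleAmount) AS TotalSales
    RESIDENT Orders
    GROUP BY StoreID;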

Load:
The load phase loads the data into the end target, usually the data warehouse (DW). Depending on the requirements of the organization, this process varies widely. Some data warehouses may overwrite existing information with cumulative information; updating extracted data is frequently done on a daily, weekly, or monthly basis. Other data warehouses (or even other parts of the same data warehouse) may add new data in an historical form at regular intervals—for example, hourly.

To understand this, consider a data warehouse that is required to maintain sales records of the last year. This data warehouse overwrites any data older than a year with newer data. However, the entry of data for any one year window is made in a historical manner. The timing and scope to replace or append are strategic design choices dependent on the time available and the business needs. More complex systems can maintain a history and audit trail of all changes to the data loaded in the data warehouse. As the load phase interacts with a database, the constraints defined in the database schema — as well as in triggers activated upon data load — apply (for example, uniqueness, referential integrity, mandatory fields), which also contribute to the overall data quality performance of the ETL process.
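The "add new data in an historical form at regular intervals" pattern described here has a common QlikView-side counterpart: an incremental load that appends only new rows to a history QVD. A hedged sketch, with invented names and assuming a database connection established earlier in the script, might look like this.

    // Load only rows newer than the last run (timestamp handling simplified).
    LET vLastLoad = '2015-01-01 00:00:00';     // in practice, read this from the previous run

    NewRows:
    LOAD *;
    SQL SELECT * FROM DWH.FACT_SALES
    WHERE LOAD_TS > '$(vLastLoad)';

    // Append the previously stored history, if it already exists.
    IF NOT IsNull(QvdCreateTime('FactSales_History.qvd')) THEN
        Concatenate (NewRows)
        LOAD * FROM FactSales_History.qvd (qvd);
    END IF

    STORE NewRows INTO FactSales_History.qvd (qvd);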

For example, a financial institution might have information on a customer in several departments and each department might have that customer's information listed in a different way. The membership department might list the customer by name, whereas the accounting department might list the customer by number. ETL can bundle all of these data and consolidate them into a uniform presentation, such as for storing in a database or data warehouse.

Another way that companies use ETL is to move information to another application permanently. For instance, the new application might use another database vendor and most likely a very different database schema. ETL can be used to transform the data into a format suitable for the new application to use. An example of this would be an Expense and Cost Recovery System (ECRS) such as used by accountancies, consultancies and lawyers. The data usually end up in the time and billing system, although some businesses may also utilize the raw data for employee productivity reports to Human Resources (personnel dept.) or equipment usage reports to Facilities Management.

fkeuroglian
Partner - Master

QlikView has its own ETL tool:

Script

Good luck

Fernando

ashfaq_haseeb
Champion III

Hi,

As stated above, any ETL tool can work with Qlik.

You can also have a look at Expressor, Qlik's ETL product.

Regards

ASHFAQ

arish_delon
Creator

Hi Vijay, we use Oracle Work Bench, but this tool is very systematic. You have to follow it to the letter.

If your team prefers something more free-flowing and customizable, check the following link:

http://www.etl-tools.com/qlikview-connector/overview.html

This tool shows a lot of promise.

Good Luck,

Arish

Not applicable
Author

Hi,

My suggestion would be Talend, as it is an open-source ETL tool with plenty of components covering everything you would expect from an ETL process.

Best,

Robert

Nancy_Jain
Partner - Contributor II

Hi,

How can we load data from Ab Initio into QlikView?

It would be a great help if you have worked on the same.

 

Thanks 

Nancy