Hello,
After the successful launch of the Hadoop QVX Writer created by Jasper Knulst, I created a QVX Writer plugin for Pentaho Data Integration.
The public version of this plugin uses the demo version of the Java QVXConverter library created by Ralf Becher from TIQ Solutions.
How to install the plugin:
Download the demo plugin (mirror)
Extract the content into: <pentaho data integration folder>/plugins/steps/. Once extracted, there should be a folder named 'QVXWriterDemo' inside the 'steps' folder.
Start Pentaho Data Integration. Create a new transformation (or use an existing one). There should be a new entry in the 'output' steps category called "QVXWriter Demo Plugin".
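The installation steps above can be sketched on the command line. This is only an illustration: the install location and archive name (QVXWriterDemo.zip) are assumptions, so substitute your own paths.

```shell
#!/bin/sh
# Hypothetical Pentaho Data Integration install location -- adjust to yours.
PDI_HOME="${PDI_HOME:-$HOME/data-integration}"
PLUGIN_DIR="$PDI_HOME/plugins/steps"

# Make sure the plugin steps folder exists.
mkdir -p "$PLUGIN_DIR"

# Extract the downloaded demo archive into the steps folder
# (assumes the download is a zip named QVXWriterDemo.zip):
# unzip QVXWriterDemo.zip -d "$PLUGIN_DIR"

# After extraction there should be a 'QVXWriterDemo' folder:
# ls "$PLUGIN_DIR/QVXWriterDemo"
```

After this, restart Pentaho Data Integration so it picks up the new step.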
Usage:
Define a location and file name (.qvx) in the "Output File Location" field, and give the file a table name in the second field. Make sure your columns have the correct data type (Integer, Number, Date, String). If you are not sure, use the "Select values" step in Pentaho to manually add or change the meta values of your columns.
Note: This demo plugin is limited to writing a maximum of 100,000 rows into a QVX file.
Note 2: On my blog I also include images showing what it should look like.
-
Bram
Another way is to pull data from a Pentaho Kettle transformation's output step right into QlikView:
tiqview.tumblr.com/post/29820190073/stream-data-from-pentaho-kettle-into-qlikview-via-jdbc