4 Replies. Latest reply: Dec 17, 2012 11:19 AM by William Kehoe

    Parameter in expressor function



      Is there a way to use, in a Datascript function, a parameter defined in a configuration file?


      I am trying to achieve the following:

      - I have a data set with a list of countries

      - At run time I would like to filter on a specific list of countries. This list may change each time I run the dataflow.

      - Therefore, every time I launch the dataflow, I would like to be able to select the countries I want to filter on.




        • Re: Parameter in expressor function
          Mohit Sharma

          You have to go into the script, then use the Insert menu > Load Statement > Load from Inline > Tools > Document Data; after that, you can apply the filter on whichever field you want.

          Hope this helps.

          • Re: Parameter in expressor function
            Michael Tarallo

            Hello Lionel - I have requested this functionality for a future release. I have opened a feature request for it, and for it to also be supported in the parameters dialog of the QlikView Expressor Connector. If my understanding is correct, you wish to create a named user-defined parameter to provide substitutable values at run time. Currently, parameter support exists for operator configuration properties only.


            I will check with our PS team to see if there is "another" creative way to achieve this.




            Mike T

            • Re: Parameter in expressor function

              Currently there are ways to do what you want when using Expressor.


              If you are interested in storing just a few unrelated values, say a string or datetime value, you can use the utility.store_string or utility.store_datetime functions (each data type has a specific function) to store the value in an RDBMS that is embedded within Expressor.  Later, use the corresponding utility.retrieve_string, etc., function to retrieve the value.
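As a hedged sketch only (these utility functions exist solely inside the Expressor Datascript runtime, and the exact name/value signatures shown here are assumptions, not documented API):

```lua
-- Assumed pattern: store a value under a name, retrieve it later by
-- the same name. The 'utility' module is only available inside the
-- Expressor Datascript runtime; this will not run in plain Lua.
utility.store_string("last_country", "United States")    -- persist a value
local country = utility.retrieve_string("last_country")  -- read it back later
```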


              However, you are asking to store a collection of values, which might contain a different number of entries each time it is stored.  The approach above is probably too cumbersome for that use case.


              There are a couple of approaches you could follow. 


              You could create a simple text file with the list of desired countries.  Then, in the first step of a dataflow, read these countries into a lookup operator.  In a second step of the dataflow, use a lookup rule in a transform operator to test whether a record includes one of these countries.  If it does, the lookup table will return the country name.  If it doesn't, the lookup table will return nil.  In a downstream filter operator, drop the records with a nil value.
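The lookup-then-filter logic can be sketched in plain Lua. This is only an illustration of the rule logic, not the actual lookup operator API; the names 'allowed', 'lookup_country', and 'keep' are hypothetical:

```lua
-- 'allowed' plays the role of the lookup table loaded in step one.
local allowed = {
   ["United States"]  = "United States",
   ["United Kingdom"] = "United Kingdom",
}

-- Lookup rule: return the country name, or nil when it is not listed.
local function lookup_country(record)
   return allowed[record.country]
end

-- Downstream filter rule: keep only records whose lookup result is non-nil.
local function keep(record)
   return lookup_country(record) ~= nil
end

print(keep({country = "United States"}))  -- true
print(keep({country = "France"}))         -- false
```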


              Alternatively, you could write the collection of countries to an external file and then read the file when you run the dataflow.  You could either create this external file manually before running the dataflow, or, if appropriate to your use case, have the dataflow create the file.


              For example, your file might simply contain a listing of countries.

              United States

              United Kingdom





              Then you could use the initialize function in the filter operator to read this list, convert it into a Datascript table, and, in a filter operator function rule, iterate through the table, emitting only those records whose country field is included in the listing.  See this knowledge base article for more on Datascript tables.


              To read and write external files, you will need to use the file IO functions of the Lua scripting language that underlies Expressor Datascript.  Look at the documentation at www.lua.org to learn about these functions.
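Assuming a plain text file with one country per line, the read-and-filter step could look like the following sketch. The names 'initialize' and 'emit_if_listed' are hypothetical; they mirror the initialize-function/rule pattern described above rather than the exact operator API:

```lua
local countries = {}

-- Read the country file into a set-style table for fast membership tests.
local function initialize(path)
   for line in io.lines(path) do     -- standard Lua file IO
      if line ~= "" then
         countries[line] = true
      end
   end
end

-- Rule: emit the record only when its country appears in the listing.
local function emit_if_listed(record)
   if countries[record.country] then
      return record                  -- country is in the list: emit it
   end
   return nil                        -- otherwise drop it
end
```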

              • Re: Parameter in expressor function
                William Kehoe

                Hi Lionel,


                Starting with QV Expressor v3.8, there is a new global Lua table named expressor.NamedParameters that contains the names and values of all configuration parameters supplied to the dataflow, either from the new --configuration option or the new --parameter option on the etask command line.
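As a rough illustration only (the --configuration and --parameter option names come from the text above, but the exact etask invocation syntax may differ by release; check the etask command-line help), a launch might look something like:

```
etask ... --configuration <config-file> --parameter countries="{'US','SE','UK'}"
```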


                I wrote a simple read-custom -> write-file dataflow that dumps the contents of the NamedParameters table to a CSV file.  The read-custom operator contains the following code (note that it uses an iterator function as the return value of the "read" function to iterate over the name/value pairs in the table):



                function getNamedParameters()
                   local parameters = {}
                   if type(expressor) ~= 'table' or type(expressor.NamedParameters) ~= 'table' then
                      return parameters
                   end
                   for a, v in pairs(expressor.NamedParameters) do
                      if type(v) == 'string' or type(v) == 'number' or is.integer(v) or is.decimal(v) then
                         table.insert(parameters, {name = a, value = tostring(v)})
                      end
                   end
                   return parameters
                end




                function read()
                   local params = getNamedParameters()
                   local pos = 0
                   return function()
                      pos = pos + 1
                      if pos > #params then
                         return nil  -- end of param list
                      end
                      return params[pos]
                   end
                end




                I have attached a ZIP export file of the project for you to import and try out.  I suggest that you define your list of country names in Lua table-initializer format, like this:

                   {'US','SE','UK'}
                so that you can easily convert this into a Lua table by prepending the keyword "return" to the front of it (to make it a valid Lua 'chunk') and using Lua's loadstring function to dynamically construct the table.  Here's a function (and a short test driver) that converts the string form of a Lua table initializer to an in-memory Lua table:


                function getCountries(countryList)
                   -- e.g. countryList would be a string like: "{'US','SE','UK'}"
                   local funcbody = "return " .. countryList
                   local f, err = loadstring(funcbody)
                   if not f then
                      error('Syntax error in table declaration ' .. (err or ''))
                   end
                   local success, t = pcall(f)
                   if not success then
                      error('Failed to evaluate table: ' .. (t or ''))
                   end
                   return t
                end



                local c = getCountries(expressor.NamedParameters['countries'])

                -- now 'c' will contain the table of countries
                for i, v in ipairs(c) do
                   print(i, v)
                end


                Please note that the new Lua table expressor.NamedParameters will only be created if the --configuration option is used on the etask command line to specify the file containing the configuration artifact contents (a configuration file is always created for each configuration bound to a dataflow when that dataflow is packaged into a deployment package).  See the etask command-line help for the syntax for specifying which value set in a configuration file to use.


                This means that you need to create a configuration artifact in Studio and bind it to the dataflow, even if you choose not to specify any configuration parameters in that artifact and instead use etask's --parameter option to set parameter values dynamically on the etask command line.


                You should also be aware that, when running dataflows directly within the dataflow editor of QV Expressor Studio, Studio does NOT specify the --configuration option when launching the dataflow, so the NamedParameters table is not created in that case.  But if you package your dataflow into a Deployment Package (just create a new deployment package and drag your dataflow into it; the bound configuration artifact will also be added), then executing the dataflow from inside the deployment package will use the --configuration command-line option, and the NamedParameters table will contain the names and values of all parameters.


                I have attached a ZIP export file of the project that contains the simple dataflow I created that dumps all of the named parameters to a CSV file.  You can import this project into your own Workspace and test it for yourself; just be sure to execute the dataflow from inside the Deployment Package and not from the dataflow editor itself.