4 Replies Latest reply: Dec 22, 2012 5:59 AM by Puneet Lakhanpal

    From_field and 1 qvd per year

      Hello,

      Let's say, I have loaded my fact data with 1 field 'Year'

      For testing purpose, I could use the following script:

      Facts:
      Load 2009 + iterno() as Year, iterno() as idSD
      AutoGenerate(1) While iterno() < 10;

      Then I want to build a loop (either For...next or Do...loop) to iterate each year.

      So I need to identify the min and the max of my fact data

      I found this help http://qlikviewmaven.blogspot.com/2009/06/quick-load-of-max-field-value.html but someone there suggests using the new QV9 feature instead: From_Field.

      I have tried it, even through the wizard (Field Data), and the generated syntax looks like this:

      LOAD * FROM_FIELD
      (Facts, Year)

      I have not used max() and min() yet, but I always get the following message:

      Cannot open file
      LOAD * FROM_FIELD
      (Facts, Year)

      My target is to get the min and max into variables so that I can build one qvd per year:

      For i = $(v_Min) to $(v_Max)
          if not isnull(QVDCreateTime('Facts_$(i).qvd')) then
              Concatenate
              LOAD * FROM Facts_$(i).QVD (qvd)
              WHERE NOT(Exists(idSD));
          end if
          STORE * from Facts where OpenYear = '$(i)' INTO Facts_$(i).QVD;
      next

      1st problem: how do I get min(Year) and max(Year) into variables?

      2nd problem: the STORE statement does not seem to allow a WHERE clause. Is that true?

      Thanks for your help!

       

        • SV:From_field and 1 qvd per year
          Goran Korsgren

          Solution for 1:

          MaxMinYear:
          Load Min(Year) As MinYear,
          Max(Year) As MaxYear
          Resident Facts;

          Let v_Min=peek('MinYear');
          Let v_Max=peek('MaxYear');

          Drop Table MaxMinYear;

           

          Answer for 2:

          Yes, it's true (as far as I know...): there is no WHERE clause in the STORE statement.

            • SV:From_field and 1 qvd per year

              Hi Göran,

              The problem with the RESIDENT load is that you re-read the whole table, and if it has millions of records you might want to avoid that. There is one solution in the blog I mentioned, and another using the FROM_FIELD statement. The problem is that I can't find any example of it on the web.

              How do we use it? I want to see its benefits. Thanks in advance.

              Too bad that the STORE statement does not accept a where clause. There are examples of splitting a qvd by year, but they don't show the code used to produce those qvds. And it makes no sense for me to re-execute my SQL query year by year...

               

                • SV:Re: SV:From_field and 1 qvd per year
                  Goran Korsgren

                  Hi

                  Well, in that case you can do the "Load FieldValue" trick described in the blog post you linked to.
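
                  For reference, that trick looks roughly like this (a sketch only, reusing the Year field from the test script above; it scans the list of distinct field values instead of the whole fact table):

                  // FieldValueCount('Year') returns the number of distinct
                  // values of Year; FieldValue('Year', RecNo()) returns the
                  // RecNo()-th distinct value. So this loop never touches
                  // the fact table's rows, only the field's value list.
                  MinMaxYear:
                  Load Min(FieldValue('Year', RecNo())) as MinYear,
                       Max(FieldValue('Year', RecNo())) as MaxYear
                  AutoGenerate FieldValueCount('Year');

                  Let v_Min = Peek('MinYear', 0, 'MinMaxYear');
                  Let v_Max = Peek('MaxYear', 0, 'MinMaxYear');

                  Drop Table MinMaxYear;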

                   

                  I have never used the FROM_FIELD feature, so I can't help you with that.

                   

                  If you want to STORE part of a QV table (not all records), you need to create a new QV table with just those records,

                  for example like this:

                  Facts2010:
                  Load *,
                  1 As dummy // this is to make Facts2010 different from Facts
                  Resident Facts
                  Where Year=2010;

                  Drop Field dummy;
                  Store Facts2010 Into Facts2010.qvd;

                  Drop Table Facts2010;
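
                  Putting the pieces together, the per-year split could look roughly like this (a sketch only; it assumes the Facts table and the v_Min/v_Max variables from the earlier posts, and uses NoConcatenate instead of the dummy field to keep the temporary table separate from Facts):

                  For i = $(v_Min) to $(v_Max)

                      // Build a temporary table holding one year's records,
                      // since STORE itself has no WHERE clause.
                      TmpFacts:
                      NoConcatenate
                      Load * Resident Facts
                      Where Year = $(i);

                      Store TmpFacts Into Facts_$(i).qvd (qvd);
                      Drop Table TmpFacts;

                  Next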

                   

                  • Re: SV:From_field and 1 qvd per year

                    Hi Nicolas,

                     

                    Did you get a solution to your problem using from_field? I am stuck in a similar situation: FieldX exists in Table A and in Table B, and the two tables are associated via FieldX. However, I need the min and max of FieldX in Table A only. Since Table A is very big, scanning through all its content takes a lot of time. Please share your thoughts if you found a resolution to this problem.
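
                    For what it's worth, FROM_FIELD loads the distinct values of one field of an already-loaded table, so a sketch for this case might be (untested; TableA and FieldX stand in for your actual table and field names, and it assumes QV9 or later):

                    // FROM_FIELD (tablename, fieldname) reads the values of
                    // fieldname from the in-memory table, so the big source
                    // table is not re-scanned from disk.
                    FieldXValues:
                    LOAD FieldX as Xvalue FROM_FIELD (TableA, FieldX);

                    MinMaxX:
                    LOAD Min(Xvalue) as MinX, Max(Xvalue) as MaxX
                    Resident FieldXValues;

                    Let v_MinX = Peek('MinX', 0, 'MinMaxX');
                    Let v_MaxX = Peek('MaxX', 0, 'MinMaxX');

                    Drop Tables FieldXValues, MinMaxX;

                    Note that the FieldValue() trick would not work here, because FieldX is shared between Table A and Table B and its value list covers both tables.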