<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Occasionally missing records from import in QlikView</title>
    <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774098#M591174</link>
    <description>&lt;P&gt;Hello Marcus,&lt;BR /&gt;I am sorry for the confusion.&lt;BR /&gt;The script is indeed only there to verify that records are missing.&amp;nbsp; I just wanted to be sure that my manual comparison between the MySQL database and the Qlik QVD finds the same missing records as this automatic check.&lt;/P&gt;&lt;P&gt;But this is not the problem. The main problem is records going missing when importing from MySQL into the QVD.&lt;BR /&gt;&lt;BR /&gt;See the next post for part of the script with an explanation.&lt;/P&gt;&lt;P&gt;Regards, Jan-Arie&lt;/P&gt;</description>
    <pubDate>Wed, 13 Jan 2021 14:58:53 GMT</pubDate>
    <dc:creator>Jan-Arie</dc:creator>
    <dc:date>2021-01-13T14:58:53Z</dc:date>
    <item>
      <title>Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1773611#M591168</link>
      <description>&lt;P&gt;Hello,&amp;nbsp; I'm new to QlikView and took over the project from an ex-colleague.&lt;/P&gt;&lt;P&gt;The problem is that Qlik very occasionally misses records after importing from a database.&lt;/P&gt;&lt;P&gt;The project: every 5 minutes, external devices at our customers put records into a MySQL database.&lt;BR /&gt;Qlik runs every 30 minutes and imports and analyzes the most recently added records from that MySQL database.&amp;nbsp; In the MySQL table every record has a unique auto-increment id, which is used as the pointer in the Qlik qvw for where to start fetching the new records.&lt;BR /&gt;In Qlik there are several queries that fetch these data, combined into one table with OUTER JOIN/LOAD DISTINCT. At the end of the script this is combined with the already existing Qlik QVD table to form one new, complete QVD table.&lt;BR /&gt;&lt;BR /&gt;This works well, but very occasionally Qlik misses records from a certain period.&lt;BR /&gt;Sometimes a complete week is imported correctly and then Qlik misses records from a time frame of 5 to 30 minutes.&lt;BR /&gt;Completely randomly: sometimes multiple times a day, sometimes only a few records of a 5-minute time frame.&lt;BR /&gt;&lt;BR /&gt;When I check the log files and execute the SQL query myself on the MySQL database, the missing records are included in the output.&lt;BR /&gt;&lt;BR /&gt;(To give an idea of how many records go missing: over the last 20 days, with ~70,000 records, 350 records were missing.)&lt;/P&gt;&lt;P&gt;Why is Qlik missing records?&lt;/P&gt;</description>
      <pubDate>Tue, 12 Jan 2021 08:00:44 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1773611#M591168</guid>
      <dc:creator>Jan-Arie</dc:creator>
      <dc:date>2021-01-12T08:00:44Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1773942#M591169</link>
      <description>&lt;P&gt;Maybe those records are not unique and are removed by the DISTINCT, and/or the records are incomplete/invalid in some way and filtered out by a where-clause or join.&lt;/P&gt;&lt;P&gt;More likely than the above is that those records are locked in the db at query time and therefore not transferred. This might be bypassed with read-only access, provided the db and driver support this feature.&lt;/P&gt;&lt;P&gt;- Marcus&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 08:31:16 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1773942#M591169</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2021-01-13T08:31:16Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1773982#M591170</link>
      <description>&lt;P&gt;Hello Marcus,&lt;/P&gt;&lt;P&gt;Thanks for your response.&lt;BR /&gt;The records in the source database are unique and complete.&amp;nbsp; They are "equal" to the other records; I mean no missing, empty or malformed fields, and with correct data in the fields.&lt;BR /&gt;The MySQL database/table is only used by the import application with INSERT queries, and Qlik only does a SELECT on this table.&lt;BR /&gt;But I will check whether there are any issues with locking.&lt;BR /&gt;The table already contains &amp;gt;60 million records, but I presume that cannot be an issue.&lt;BR /&gt;&lt;BR /&gt;And when I execute the queries that Qlik uses myself, I get the correct records.&amp;nbsp; So the joining in the Qlik script may be an issue, but then I can't imagine why it works for weeks and then sometimes doesn't.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 10:26:21 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1773982#M591170</guid>
      <dc:creator>Jan-Arie</dc:creator>
      <dc:date>2021-01-13T10:26:21Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1773996#M591171</link>
      <description>&lt;P&gt;It might be helpful to compare the records in the db with those that are loaded in Qlik. That means you need an additional query on the db, maybe something like:&amp;nbsp;select count(*) from &amp;lt;tablename&amp;gt;, or by querying the db metadata, or per max(auto-increment id) or similar measures. In Qlik you could use: noofrows('tablename').&lt;/P&gt;&lt;P&gt;If there are any differences you could raise an alert and/or try to identify when and which records are missing - isolating the nearest timestamp and the records for which it happens may give some hints about the cause.&lt;/P&gt;&lt;P&gt;- Marcus&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 11:07:08 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1773996#M591171</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2021-01-13T11:07:08Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774008#M591172</link>
      <description>&lt;P&gt;Hello Marcus,&lt;/P&gt;&lt;P&gt;Yes, I have done that.&amp;nbsp; First I did it with a SELECT query in MySQL and stored the result. Then I converted the QVD data file and compared it with the MySQL result set.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now I have made a Qlik script which does it directly (in this case for two Tags and from a certain date):&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Start data comparison;&lt;BR /&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Fetch ESync data;&lt;BR /&gt;TableESyncTagsHistory:&lt;/P&gt;&lt;P&gt;LOAD DISTINCT&lt;BR /&gt;Id as Id,&lt;BR /&gt;TagId as TagId,&lt;BR /&gt;EsyncTimeStamp as EsyncTimeStamp;&lt;BR /&gt;SQL SELECT&lt;BR /&gt;a.id as Id, a.TagId,&lt;BR /&gt;a._date as EsyncTimeStamp&lt;BR /&gt;from&lt;BR /&gt;esync.`esync_tagshistory` a&lt;BR /&gt;where&lt;BR /&gt;a._date &amp;gt; '2020-12-17'&lt;BR /&gt;and ( a.TagId = '5532' or a.TagId = '5533' )&lt;BR /&gt;ORDER BY&lt;BR /&gt;a.TagId, a._date;&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Fetch QlikView data;&lt;BR /&gt;TableProcessData:&lt;BR /&gt;LOAD DISTINCT&lt;BR /&gt;EsyncTagsHistoryID as Id,&lt;BR /&gt;TagId as TagId,&lt;BR /&gt;TimeStamp as QlikTimeStamp&lt;BR /&gt;FROM [D:\temp\qvd\qProcessdata.qvd](qvd)&lt;BR /&gt;where&lt;BR /&gt;TimeStamp &amp;gt; '2020-12-17'&lt;BR /&gt;AND ( TagId = 5532 OR TagId = 5533 )&lt;BR /&gt;;&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Store tables for debugging;&lt;BR /&gt;STORE TableESyncTagsHistory INTO D:\temp\qvd\qTableESyncTagsHistory.qvd;&lt;BR /&gt;STORE TableProcessData INTO D:\temp\qvd\qTableProcessData.qvd;&lt;/P&gt;&lt;P&gt;DROP TABLE TableESyncTagsHistory;&lt;BR /&gt;DROP TABLE TableProcessData;&lt;/P&gt;&lt;P&gt;// Merge the two tables; to be sure they are not taken from memory, read them back from the files&lt;BR /&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Merge data;&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Load Esync table;&lt;BR /&gt;TableMergeData:&lt;BR /&gt;LOAD * FROM [D:\temp\qvd\qTableESyncTagsHistory.qvd](qvd);&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Load Qlik table;&lt;BR /&gt;JOIN LOAD * FROM [D:\temp\qvd\qTableProcessData.qvd](qvd);&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Store merge table for debugging;&lt;BR /&gt;STORE * FROM TableMergeData INTO D:\temp\qvd\qTableMergeData.qvd;&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Drop temporary tables;&lt;BR /&gt;DROP TABLE TableMergeData;&lt;BR /&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; FINISHED;&lt;BR /&gt;// end script&lt;/P&gt;&lt;P&gt;When I now analyze the qTableMergeData.qvd file, the records missing in Qlik don't have QlikTimeStamp filled in.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-left" image-alt="Missing records" style="width: 200px;"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/46889iA704563E4AFE6293/image-size/small?v=v2&amp;amp;px=200" role="button" title="missingrecords.png" alt="Missing records" /&gt;&lt;span class="lia-inline-image-caption" onclick="event.preventDefault();"&gt;Missing records&lt;/span&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;But this only confirms that records are missing; it is not a solution to prevent it.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 12:03:20 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774008#M591172</guid>
      <dc:creator>Jan-Arie</dc:creator>
      <dc:date>2021-01-13T12:03:20Z</dc:date>
    </item>
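The merge check in the post above boils down to a set difference on the id column: rows whose id appears in the MySQL extract but not in the QVD extract are exactly the rows whose QlikTimeStamp stays empty after the join. A minimal Python sketch of the same check, with hypothetical id sets standing in for the two extracts:

```python
# Hypothetical id sets standing in for the two extracts; in the real check
# the left side comes from esync.esync_tagshistory and the right side from
# qProcessdata.qvd.
db_ids = {101, 102, 103, 104, 105}
qvd_ids = {101, 102, 104, 105}

# Rows present in MySQL but absent from the QVD: the rows whose
# QlikTimeStamp stays empty in the merged table.
missing = sorted(db_ids - qvd_ids)
print(missing)  # -> [103]
```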
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774092#M591173</link>
      <description>&lt;P&gt;To me it looked as if you compare a current sql load with historical qvd data, because qTableProcessData.qvd exists already and must have been loaded/created earlier. I was thinking more in a direction like:&lt;/P&gt;&lt;P&gt;#records: sql select #records from db.systemtable where table = 'YourTable';&lt;/P&gt;&lt;P&gt;or maybe:&lt;/P&gt;&lt;P&gt;#records: sql select count(*) as #records from db.YourTable;&lt;/P&gt;&lt;P&gt;t: load fields …; sql select fields … from db.YourTable;&lt;BR /&gt;store t into t.qvd (qvd);&lt;/P&gt;&lt;P&gt;let check1 = noofrows('t') = peek('#records', 0, '#records');&lt;BR /&gt;let check2 = qvdnoofrecords('t') = peek('#records', 0, '#records');&lt;/P&gt;&lt;P&gt;In a second step you may also include timestamp information, especially if the queries take some time.&lt;/P&gt;&lt;P&gt;- Marcus&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 14:36:16 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774092#M591173</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2021-01-13T14:36:16Z</dc:date>
    </item>
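The guard Marcus sketches compares a row count taken on the database against the number of rows that actually arrived in Qlik (noofrows / qvdnoofrecords). The same idea in a small Python sketch, with hypothetical values:

```python
def extract_is_complete(source_count: int, loaded_rows: list) -> bool:
    """Mirror of check1/check2 above: the extract is complete only if the
    number of loaded rows equals the count reported by the database."""
    return len(loaded_rows) == source_count

# Hypothetical run: the db reports 4 rows but only 3 arrived.
print(extract_is_complete(4, ["r1", "r2", "r3"]))  # -> False
print(extract_is_complete(3, ["r1", "r2", "r3"]))  # -> True
```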
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774098#M591174</link>
      <description>&lt;P&gt;Hello Marcus,&lt;BR /&gt;I am sorry for the confusion.&lt;BR /&gt;The script is indeed only there to verify that records are missing.&amp;nbsp; I just wanted to be sure that my manual comparison between the MySQL database and the Qlik QVD finds the same missing records as this automatic check.&lt;/P&gt;&lt;P&gt;But this is not the problem. The main problem is records going missing when importing from MySQL into the QVD.&lt;BR /&gt;&lt;BR /&gt;See the next post for part of the script with an explanation.&lt;/P&gt;&lt;P&gt;Regards, Jan-Arie&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 14:58:53 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774098#M591174</guid>
      <dc:creator>Jan-Arie</dc:creator>
      <dc:date>2021-01-13T14:58:53Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774109#M591175</link>
      <description>&lt;P&gt;This is (a part of) the script.&lt;BR /&gt;On other tabs there are further queries that fetch different sets of data.&lt;/P&gt;&lt;P&gt;// Select which stations must be read&amp;nbsp; (different customers, but we only need 2 of them)&lt;BR /&gt;SET IdCustomer="(17,21)";&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Load all separate process data into TempProcessData;&lt;/P&gt;&lt;P&gt;// Force the date format for the eSync database&lt;BR /&gt;let vLastTimeStamp = date(vLastTimeStamp,'YYYY-MM-DD hh:mm:ss');&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Load TempProcessData with Capacity;&lt;BR /&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Calculate the production data for the counters;&lt;BR /&gt;&lt;BR /&gt;TempProcessData:&lt;BR /&gt;LOAD DISTINCT&lt;BR /&gt;TagId,&lt;BR /&gt;id as EsyncTagsHistoryID,&lt;BR /&gt;ApplyMap('MapMachinePart',TagId) as MachinePart,&lt;BR /&gt;ApplyMap('MapMachineNumber',TagId) as MachineNumber,&lt;BR /&gt;ApplyMap('MapLineNumber',TagId) as LineNumber,&lt;BR /&gt;ApplyMap('MapTrackNumber',TagId) as TrackNumber,&lt;BR /&gt;Val as Counter_Val,&lt;BR /&gt;Production as Production_Val,&lt;BR /&gt;_date as TimeStamp,&lt;BR /&gt;StationId;&lt;BR /&gt;SQL SELECT b.TagId,&lt;BR /&gt;b.id, b._date,&amp;nbsp; a.StationId, b.Val,&lt;BR /&gt;if(@lastTag = b.TagId,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; if(b.Val - @lastValue&amp;gt;=0,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; b.Val - @lastValue,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; b.Val),&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; if(b.Val - (SELECT @prevVal := c.Val from esync_tagshistory c&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; where c.id &amp;lt; b.id&lt;BR
/&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; and c.TagId = b.TagId&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; and c.`_date` &amp;gt; (now() - interval 48 hour )&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; order by c.id desc limit 1) &amp;gt;= 0,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; b.Val - @prevVal,&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 0&lt;BR /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; )&lt;BR /&gt;) as Production,&lt;BR /&gt;@lastTag := b.TagId,&lt;BR /&gt;@lastValue := b.Val,&lt;BR /&gt;@lastID := b.id&lt;BR /&gt;FROM (SELECT * FROM esync.`esync_tags` WHERE Name LIKE "&lt;STRONG&gt;%_Counter&lt;/STRONG&gt;" AND StationId IN $(IdCustomer)) a,&lt;BR /&gt;esync.`esync_tagshistory` b ,&lt;BR /&gt;(SELECT @lastTag := 0, @lastValue := 0, @lastID := 0, @prevVal := 0) SQLVars&lt;BR /&gt;WHERE a.Id = b.TagId AND b.id &amp;gt; '$(vTagsHistoryID)'&lt;BR /&gt;ORDER BY b.TagId, b._date;&lt;/P&gt;&lt;P&gt;//&amp;nbsp; ========&amp;gt; The following section with "&lt;STRONG&gt;%_Capacity&lt;/STRONG&gt;" will be repeated several times, but then the&lt;BR /&gt;// name will be "%_MaxCapacity" or "%_CrateType" or ...&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;OUTER JOIN (TempProcessData)&lt;BR /&gt;LOAD DISTINCT&lt;BR /&gt;TagId,&lt;BR /&gt;id as EsyncTagsHistoryID,&lt;BR /&gt;ApplyMap('MapMachinePart',TagId) as MachinePart,&lt;BR /&gt;ApplyMap('MapMachineNumber',TagId) as MachineNumber,&lt;BR /&gt;ApplyMap('MapLineNumber',TagId) as LineNumber,&lt;BR /&gt;ApplyMap('MapTrackNumber',TagId) as TrackNumber,&lt;BR /&gt;Val as Capacity_Val,&lt;BR /&gt;_date as TimeStamp,&lt;BR /&gt;StationId;&lt;BR /&gt;SQL SELECT TagId, b.id, _date, b.Val, a.StationId FROM (SELECT * FROM esync.`esync_tags` 
WHERE Name LIKE "&lt;STRONG&gt;%_Capacity&lt;/STRONG&gt;" AND StationId IN $(IdCustomer)) a, esync.`esync_tagshistory` b WHERE a.Id = b.TagId AND b.id &amp;gt; $(vTagsHistoryID);&lt;/P&gt;&lt;P&gt;//&amp;nbsp; ========&amp;gt; End of repeated section&lt;/P&gt;&lt;P&gt;// Only for test:&amp;nbsp; (files are stored apart so that I can analyze these files)&lt;BR /&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Store TempProcessdata into qTempProcessData.qvd;&lt;BR /&gt;STORE TempProcessData INTO $(vQVDpath)qTempProcessdata.qvd;&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Load existing Processdata and concatenate new data from TempProcessData;&lt;/P&gt;&lt;P&gt;OrderProcessData:&lt;BR /&gt;LOAD * FROM $(vQVDpath)qProcessdata.qvd (qvd);&lt;BR /&gt;CONCATENATE LOAD&lt;BR /&gt;StationId, TagId, EsyncTagsHistoryID, MachinePart, MachineNumber, LineNumber, TrackNumber, year(TimeStamp) as SampleP_Year, month(TimeStamp) as SampleP_Month,&amp;nbsp; week(TimeStamp) as SampleP_Week, weekday(TimeStamp) as SampleP_Weekday, day(TimeStamp) as SampleP_Day,&lt;BR /&gt;hour(TimeStamp) as SampleP_Hour, TimeStamp,&lt;BR /&gt;Autostart_Val, ProductionDuration_Val, Counter_Val, Production_Val, Capacity_Val, MaxCapacity_Val,&lt;BR /&gt;CrateType_Val, DowntimeHour_Val,&lt;BR /&gt;TempS1_Val, TempS2_Val, TempS3_Val, TempS4_Val, TempS5_Val, LevelS1_Val, LevelZ1_Val, LevelS2_Val,&lt;BR /&gt;LevelZ2_Val, LevelS3_Val, LevelZ3_Val, LevelS4_Val, LevelZ4_Val, LevelS5_Val, LevelZ5_Val&lt;BR /&gt;RESIDENT TempProcessData;&lt;/P&gt;&lt;P&gt;ProcessData:&lt;BR /&gt;NOCONCATENATE LOAD * RESIDENT OrderProcessData&lt;BR /&gt;ORDER BY TagId, TimeStamp;&lt;BR /&gt;DROP TABLE OrderProcessData;&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Store Processdata;&lt;BR /&gt;STORE * FROM ProcessData INTO $(vQVDpath)qProcessdata.qvd;&lt;BR /&gt;DROP TABLE ProcessData;&lt;/P&gt;&lt;P&gt;// &lt;EM&gt;Determine now the latest/highest record ID fetched in this run. 
And use that as start pointer the next time.&lt;/EM&gt;&lt;BR /&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; used Tags historyID: $(vTagsHistoryID);&lt;BR /&gt;LastEsyncTagsHistoryID:&lt;BR /&gt;LOAD&lt;BR /&gt;MAX(EsyncTagsHistoryID) as EsyncTagsHistoryID&lt;BR /&gt;FROM $(vQVDpath)qProcessdata.qvd (qvd);&lt;BR /&gt;LET vTagsHistoryID = peek('EsyncTagsHistoryID',0,'vTagsHistoryID');&lt;BR /&gt;STORE LastEsyncTagsHistoryID INTO $(vQVDpath)qTagsHistoryID.qvd;&lt;/P&gt;&lt;P&gt;TRACE &amp;gt;&amp;gt;&amp;gt;&amp;gt;&amp;gt; Next Tags historyID: $(vTagsHistoryID);&lt;BR /&gt;DROP TABLE LastEsyncTagsHistoryID;&lt;/P&gt;&lt;P&gt;// End of the script&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 15:23:05 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774109#M591175</guid>
      <dc:creator>Jan-Arie</dc:creator>
      <dc:date>2021-01-13T15:23:05Z</dc:date>
    </item>
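The incremental pattern in this script - fetch rows above the stored pointer, concatenate them with the existing QVD, store, then derive the next pointer from the stored file - can be sketched as follows; `history`, `qvd` and `pointer` are hypothetical in-memory stand-ins for esync_tagshistory, qProcessdata.qvd and vTagsHistoryID:

```python
# Hypothetical stand-ins for the script's moving parts.
def incremental_run(history, qvd, pointer):
    """One scheduled run of the load script."""
    # SQL: ... WHERE b.id > $(vTagsHistoryID)
    new_rows = [r for r in history if r["id"] > pointer]
    # CONCATENATE with the existing QVD content, then STORE
    qvd = qvd + new_rows
    # LOAD MAX(EsyncTagsHistoryID) FROM the stored QVD -> next pointer
    pointer = max((r["id"] for r in qvd), default=pointer)
    return qvd, pointer

history = [{"id": i, "val": i * 10} for i in range(1, 6)]
qvd, ptr = incremental_run(history, [], 0)
print(len(qvd), ptr)  # -> 5 5
```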
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774136#M591176</link>
      <description>&lt;P&gt;I don't see an obvious problem with the script. Probably not related to your issue, but sometimes the cause of problems is the use of variables, because they could contain unexpected values. This means that in cases with strange results the variables should be monitored.&lt;/P&gt;&lt;P&gt;Beside this, I think you really need to check the number of records in each step. Often it's also helpful to include recno() and rowno() within the Qlik loads.&lt;/P&gt;&lt;P&gt;As mentioned, the most likely cause is locking within the database. Maybe you could force such locking by intentionally accessing some of the records yourself together with some colleagues - if those records then go missing, you have found the cause without the more or less expensive record-count checks in all those loads.&lt;/P&gt;&lt;P&gt;- Marcus&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 16:45:26 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774136#M591176</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2021-01-13T16:45:26Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774264#M591177</link>
      <description>&lt;P&gt;Hi Marcus,&lt;/P&gt;&lt;P&gt;I am going to add logging of all variables and counters. Then I can see at which point the differences occur.&lt;BR /&gt;Thanks for your help.&lt;BR /&gt;&lt;BR /&gt;Jan-Arie&lt;/P&gt;</description>
      <pubDate>Thu, 14 Jan 2021 07:03:45 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774264#M591177</guid>
      <dc:creator>Jan-Arie</dc:creator>
      <dc:date>2021-01-14T07:03:45Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774291#M591178</link>
      <description>&lt;P&gt;Your sql isn't extremely complex, but it's also not a simple pull from a single table. Among other things it contains several sub-queries and various tables. Therefore it could become quite difficult to detect any locking, or to ensure that all relevant tables really have complete and valid data - meaning all keys/data might already be in the main table while not all conditions are fulfilled at that moment, so some records get excluded.&lt;/P&gt;&lt;P&gt;In that case there is no error and no really missing records (it happens as it should); rather, the point in time of the data extract is different, and you compared apples with oranges. To exclude any possible error in loading the data into Qlik, you could use your query to create temporary table(s) within your database and/or export them into an external csv file, maybe just adding a timestamp to the table/file name, and then compare the number of records in this table against the qvd and against a current query in the db with the previous condition parameters.&lt;/P&gt;&lt;P&gt;Beside this, the approach of creating (several) temporary tables to avoid/minimize the various (sub-)queries might improve the query performance quite heavily and also reduces the risk of conflicts with any locking - and, not mentioned yet, with any kind of time-out, which Qlik won't notice unless the db + driver return an error. Therefore a check of the execution times and/or the performance logs from Qlik/DB/OS may also give some hints.&lt;/P&gt;&lt;P&gt;- Marcus&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 14 Jan 2021 08:47:21 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774291#M591178</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2021-01-14T08:47:21Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774303#M591179</link>
      <description>&lt;P&gt;That sounds plausible.&lt;BR /&gt;The plan is to create temporary tables and store them in csv or another format, and then convert/import those temporary tables into Qlik.&amp;nbsp; I can go a long way with your ideas and suggestions.&lt;BR /&gt;Marcus, thank you very much!&lt;/P&gt;&lt;P&gt;Jan-Arie&lt;/P&gt;</description>
      <pubDate>Thu, 14 Jan 2021 09:51:58 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1774303#M591179</guid>
      <dc:creator>Jan-Arie</dc:creator>
      <dc:date>2021-01-14T09:51:58Z</dc:date>
    </item>
    <item>
      <title>Re: Occasionally missing records from import</title>
      <link>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1776714#M591180</link>
      <description>&lt;P&gt;I think I have found the cause of the missing records.&lt;BR /&gt;&lt;BR /&gt;The queries that read from the source MySQL table start at the point in the table reached in the latest run.&lt;BR /&gt;This is the 'vTagsHistoryID' pointer, which is fetched at the very end of the script and derived from the records in the final QVD file (MAX function).&lt;BR /&gt;But while the QVW script runs, new records are being inserted by the customer systems. Completing the script takes roughly 5 to 10 minutes, so it is very likely that new records are inserted during this time.&lt;BR /&gt;&lt;BR /&gt;The first query nicely fetches from the starting pointer to the end of the source table, but its last record is no longer the latest record for the queries that follow. The result set is a little bigger for each successive query.&lt;BR /&gt;When the last queries fetch results and put them into the final table, the maximum ID is higher than what was available during the first queries.&lt;BR /&gt;The result is that in the next run the starting pointer is higher than the maximum id value seen by the first query, and therefore records are skipped.&lt;BR /&gt;&lt;BR /&gt;The customer systems do not insert at exactly the same interval - it can vary by a few minutes - so the difference in where to start reading ranged from very small (no missing records) to larger (a greater chance of missing records).&lt;BR /&gt;&lt;BR /&gt;This is what I modified to solve it:&lt;BR /&gt;First determine the maximum ID value in the source MySQL table before running any other query.&lt;BR /&gt;Modify the queries so that they fetch only data in that range (e.g. " AND b.id &amp;gt; '$(vTagsHistoryID)' AND b.id &amp;lt;= '$(vMAXTagsHistoryID)' ").&lt;BR /&gt;And use/store that maximum id as the starting point for the next run.&lt;BR /&gt;But don't use the maximum id from the derived QVD table!&lt;/P&gt;&lt;P&gt;- Jan-Arie&lt;/P&gt;</description>
      <pubDate>Fri, 22 Jan 2021 12:56:52 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Occasionally-missing-records-from-import/m-p/1776714#M591180</guid>
      <dc:creator>Jan-Arie</dc:creator>
      <dc:date>2021-01-22T12:56:52Z</dc:date>
    </item>
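The failure mode described above - later queries in the same run seeing rows that earlier queries never saw, and the next run's pointer jumping past them - can be reproduced with a small simulation. This is a hypothetical Python sketch, not the actual QlikView script: rows are (id, tag) pairs, there is one "query" per tag, and `snapshot_first` switches between deriving the pointer from the fetched data (the original script) and snapshotting the table's max id before any query runs (the fix):

```python
# Hypothetical simulation of the pointer race; not the real script.

def extract(table, tag, lo, hi):
    """One per-tag query: rows with lo < id <= hi and the matching tag."""
    return [r for r in table if r[1] == tag and lo < r[0] <= hi]

def run(table, pointer, inserts_between, snapshot_first):
    """One scheduled run: query tag 'A', then (after mid-run inserts) tag 'B'.
    Returns (rows fetched this run, pointer for the next run)."""
    hi = max(r[0] for r in table) if snapshot_first else float("inf")
    got = extract(table, "A", pointer, hi)       # first query of the run
    table.extend(inserts_between)                # customer systems keep inserting
    got += extract(table, "B", pointer, hi)      # a later query of the same run
    if snapshot_first:
        return got, hi                           # fix: pointer = the snapshot
    # original behaviour: pointer = max id of everything fetched so far
    return got, max((r[0] for r in got), default=pointer)

def missed_ids(snapshot_first):
    table = [(1, "A"), (2, "B"), (3, "A"), (4, "B")]
    # Run 1: rows 5 (tag A) and 6 (tag B) arrive while the script runs.
    got1, ptr = run(table, 4, [(5, "A"), (6, "B")], snapshot_first)
    # Run 2: a quiet run with no mid-run inserts.
    got2, ptr = run(table, ptr, [], snapshot_first)
    fetched = {r[0] for r in got1} | {r[0] for r in got2}
    return sorted({r[0] for r in table if r[0] > 4} - fetched)

print(missed_ids(snapshot_first=False))  # -> [5]  (row 5 is skipped forever)
print(missed_ids(snapshot_first=True))   # -> []   (picked up by the next run)
```

Row 5 is missed in the original variant because it arrives after the tag-A query ran, yet its id lies below the run's final maximum, so the next run's pointer starts above it; snapshotting the upper bound first leaves it above the pointer for the next run.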
  </channel>
</rss>

