<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Store duplicated rows into another table in QlikView</title>
    <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630496#M446423</link>
    <description>&lt;P&gt;Yeah, that's the approach.&lt;/P&gt;&lt;P&gt;But I wanted to know whether there was any "out of the box" solution, like a "reverse distinct" &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;It seems there isn't.&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
    <pubDate>Wed, 02 Oct 2019 13:32:05 GMT</pubDate>
    <dc:creator>lfalmoguera</dc:creator>
    <dc:date>2019-10-02T13:32:05Z</dc:date>
    <item>
      <title>Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630421#M446414</link>
      <description>&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;I have a quick question: can I store the duplicated rows in a separate table in order to deal with them afterwards?&lt;/P&gt;&lt;P&gt;Imagine I have this TABLE_EXAMPLE (my real table is much bigger, with over 200 fields):&lt;/P&gt;&lt;TABLE border="0" cellspacing="0" cellpadding="0"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;field1&lt;/TD&gt;&lt;TD&gt;field2&lt;/TD&gt;&lt;TD&gt;field3&lt;/TD&gt;&lt;TD&gt;field4&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Then I script&lt;/P&gt;&lt;P&gt;LOAD DISTINCT * Resident TABLE_EXAMPLE;&lt;/P&gt;&lt;P&gt;and get&lt;/P&gt;&lt;TABLE border="0" cellspacing="0" cellpadding="0"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;field1&lt;/TD&gt;&lt;TD&gt;field2&lt;/TD&gt;&lt;TD&gt;field3&lt;/TD&gt;&lt;TD&gt;field4&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;TD&gt;2&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;TD&gt;3&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is there a quick way to get the "erased" row into another table, i.e.:&lt;/P&gt;&lt;P&gt;TABLE_DUPLICATED_ROW:&lt;/P&gt;&lt;TABLE border="0" cellspacing="0" cellpadding="0"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD&gt;field1&lt;/TD&gt;&lt;TD&gt;field2&lt;/TD&gt;&lt;TD&gt;field3&lt;/TD&gt;&lt;TD&gt;field4&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;TD&gt;1&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Something like&lt;/P&gt;&lt;P&gt;LOAD * "duplicated" Resident TABLE_EXAMPLE?&lt;/P&gt;&lt;P&gt;I want to avoid building a 200-field key.&lt;/P&gt;&lt;P&gt;Kind regards.&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 02:19:53 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630421#M446414</guid>
      <dc:creator>lfalmoguera</dc:creator>
      <dc:date>2024-11-16T02:19:53Z</dc:date>
    </item>
    <item>
      <title>Re: Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630434#M446415</link>
      <description>&lt;P&gt;I think you need to add a field using Count(duplicatedfield) as Count, grouped by some ID.&lt;/P&gt;&lt;P&gt;Then you can do a resident load of the table with the condition Where Count &amp;gt; 1.&lt;/P&gt;</description>
      <pubDate>Wed, 02 Oct 2019 11:46:50 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630434#M446415</guid>
      <dc:creator>DavidM</dc:creator>
      <dc:date>2019-10-02T11:46:50Z</dc:date>
    </item>
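    The count-and-group-by idea above can be sketched in load script against the four-field example data from the question. This is a hypothetical illustration, not code from the thread: RowKey, RowCount and TABLE_DUPLICATED_ROW are invented names, and Hash128 over the field list stands in for a hand-built composite key.

    ```
    // Build the example table and add a composite row key in a preceding load.
    TABLE_EXAMPLE:
    LOAD *,
         Hash128(field1, field2, field3, field4) as RowKey;
    LOAD * INLINE [
        field1, field2, field3, field4
        1, 1, 1, 1
        1, 1, 1, 1
        2, 2, 2, 2
        3, 3, 3, 3
    ];

    // Count how often each key occurs and join the count back onto the rows.
    Left Join (TABLE_EXAMPLE)
    LOAD RowKey,
         Count(RowKey) as RowCount
    Resident TABLE_EXAMPLE
    Group By RowKey;

    // Keys occurring more than once mark the duplicated rows.
    TABLE_DUPLICATED_ROW:
    NoConcatenate
    LOAD DISTINCT field1, field2, field3, field4
    Resident TABLE_EXAMPLE
    Where RowCount > 1;

    // In a real script you would typically STORE or DROP one of the
    // tables here to avoid a synthetic key between them.
    DROP Fields RowKey, RowCount From TABLE_EXAMPLE;
    ```

    With the example data, TABLE_DUPLICATED_ROW would hold the single duplicated combination (1, 1, 1, 1). The Group By is the expensive part, which is the same cost the later replies in this thread try to avoid.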
    <item>
      <title>Re: Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630447#M446417</link>
      <description>&lt;P&gt;Without a key it might not be possible, but you can dynamically create the key field like this:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;TABLE_EXAMPLE:
LOAD * INLINE [
    field1, field2, field3, field4
    1, 1, 1, 1
    1, 1, 1, 1
    2, 2, 2, 2
    3, 3, 3, 3
];

TempTable:
CrossTable (Field, Value)
LOAD 1 as RowNum,
	 *
Resident TABLE_EXAMPLE;

TempTable2:
LOAD Concat(DISTINCT '[' &amp;amp; Field &amp;amp; ']', '&amp;amp;') as ConcatField1,
	 Concat(DISTINCT '[' &amp;amp; Field &amp;amp; ']', ',') as ConcatField2
Resident TempTable;

LET vConcatField1 = Peek('ConcatField1');
TRACE $(vConcatField1);
LET vConcatField2 = Peek('ConcatField2');
TRACE $(vConcatField2);

DROP Tables TempTable, TempTable2;

Left Join (TABLE_EXAMPLE)
LOAD $(vConcatField2),
	 Count($(vConcatField)) as Count
Resident TABLE_EXAMPLE
Group By $(vConcatField2);

Table:
LOAD DISTINCT $(vConcatField2)
Resident TABLE_EXAMPLE;

Duplicate:
LOAD DISTINCT $(vConcatField2),
	 'Duplicate' as DuplicateFlag
Resident TABLE_EXAMPLE
Where Count &amp;gt; 1;

DROP Table TABLE_EXAMPLE;&lt;/LI-CODE&gt;</description>
      <pubDate>Wed, 02 Oct 2019 12:06:56 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630447#M446417</guid>
      <dc:creator>sunny_talwar</dc:creator>
      <dc:date>2019-10-02T12:06:56Z</dc:date>
    </item>
    <item>
      <title>Re: Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630496#M446423</link>
      <description>&lt;P&gt;Yeah, that's the approach.&lt;/P&gt;&lt;P&gt;But I wanted to know whether there was any "out of the box" solution, like a "reverse distinct" &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;It seems there isn't.&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Wed, 02 Oct 2019 13:32:05 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630496#M446423</guid>
      <dc:creator>lfalmoguera</dc:creator>
      <dc:date>2019-10-02T13:32:05Z</dc:date>
    </item>
    <item>
      <title>Re: Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630509#M446425</link>
      <description>&lt;P&gt;Thanks a lot.&lt;/P&gt;&lt;P&gt;Just a slight amendment, in case someone copies and pastes your code: there's a missing 1 in line 3.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Left Join (TABLE_EXAMPLE)
LOAD $(vConcatField2),
	 Count($(vConcatField1)) as Count
Resident TABLE_EXAMPLE
Group By $(vConcatField2);&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The code works fantastically and it's automated, but with 200+ fields per row and 10M+ rows it makes the server go nuts :)&lt;/P&gt;&lt;P&gt;Anyhow, much appreciated!&lt;/P&gt;&lt;P&gt;Anyone got a less processing-heavy idea? :'(&lt;/P&gt;</description>
      <pubDate>Wed, 02 Oct 2019 13:49:37 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630509#M446425</guid>
      <dc:creator>lfalmoguera</dc:creator>
      <dc:date>2019-10-02T13:49:37Z</dc:date>
    </item>
    <item>
      <title>Re: Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630518#M446426</link>
      <description>&lt;P&gt;Yup, my bad for the typo...&lt;/P&gt;&lt;P&gt;I guess the number of fields might not be causing the issue, but the Group By is... we might be able to optimize this using Where Exists... let me play around with it.&lt;/P&gt;</description>
      <pubDate>Wed, 02 Oct 2019 13:56:12 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630518#M446426</guid>
      <dc:creator>sunny_talwar</dc:creator>
      <dc:date>2019-10-02T13:56:12Z</dc:date>
    </item>
    <item>
      <title>Re: Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630541#M446430</link>
      <description>&lt;P&gt;Give this a shot&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;TABLE_EXAMPLE:
LOAD * INLINE [
    field1, field2, field3, field4
    1, 1, 1, 1
    1, 1, 1, 1
    2, 2, 2, 2
    3, 3, 3, 3
];

TempTable:
CrossTable (Field, Value)
LOAD 1 as RowNum,
	 *
Resident TABLE_EXAMPLE
Where RecNo() = 1;

TempTable2:
LOAD Concat(DISTINCT '[' &amp;amp; Field &amp;amp; ']', '&amp;amp;') as ConcatField1,
	 Concat(DISTINCT '[' &amp;amp; Field &amp;amp; ']', ',') as ConcatField2
Resident TempTable;

LET vConcatField1 = Peek('ConcatField1');
TRACE $(vConcatField1);
LET vConcatField2 = Peek('ConcatField2');
TRACE $(vConcatField2);

DROP Tables TempTable, TempTable2;

TABLE_EXAMPLE2:
LOAD *,
	 If(Key = Previous(Key), RangeSum(Peek('KeyTemp'), 1), 1) as KeyTemp,
	 Hash128($(vConcatField1)&amp;amp;AutoNumber(If(Key = Previous(Key), RangeSum(Peek('KeyTemp'), 1), 1), $(vConcatField1))) as Key2;
LOAD *,
	 Hash128($(vConcatField1)) as Key	 
Resident TABLE_EXAMPLE
Order By $(vConcatField2);

DROP Table TABLE_EXAMPLE;
RENAME Table TABLE_EXAMPLE2 to TABLE_EXAMPLE;

Table:
LOAD DISTINCT $(vConcatField2),
	 Hash128($(vConcatField1)&amp;amp;1) as Key3
Resident TABLE_EXAMPLE;

Duplicate:
LOAD DISTINCT $(vConcatField2),
	 'Duplicate' as DuplicateFlag
Resident TABLE_EXAMPLE
Where not Exists(Key3, Key2);

DROP Table TABLE_EXAMPLE;
DROP Field Key3;&lt;/LI-CODE&gt;</description>
      <pubDate>Wed, 02 Oct 2019 14:27:27 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1630541#M446430</guid>
      <dc:creator>sunny_talwar</dc:creator>
      <dc:date>2019-10-02T14:27:27Z</dc:date>
    </item>
    <item>
      <title>Re: Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1633180#M446628</link>
      <description>&lt;P&gt;Luis, did Sunny's last post and example get you a good working solution? If so, please be sure to come back and use the Accept as Solution button on that post to let others know that it worked and to give Sunny credit for the help. If you have further questions, leave an update.&lt;/P&gt;
&lt;P&gt;Regards,&lt;BR /&gt;Brett&lt;/P&gt;</description>
      <pubDate>Wed, 09 Oct 2019 16:47:46 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1633180#M446628</guid>
      <dc:creator>Brett_Bleess</dc:creator>
      <dc:date>2019-10-09T16:47:46Z</dc:date>
    </item>
    <item>
      <title>Re: Store duplicated rows into another table</title>
      <link>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1676033#M449771</link>
      <description>&lt;P&gt;Sorry Brett, my bad.&lt;/P&gt;&lt;P&gt;Solution accepted!&lt;/P&gt;</description>
      <pubDate>Sun, 16 Feb 2020 10:38:43 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Store-duplicated-rows-into-another-table/m-p/1676033#M449771</guid>
      <dc:creator>lfalmoguera</dc:creator>
      <dc:date>2020-02-16T10:38:43Z</dc:date>
    </item>
  </channel>
</rss>