Not applicable

Duplicate Data

// Extract from SQL Server and store to QVD.
No1:
SQL SELECT assetno,
    assetshortdesc,
    "org_date",
    "orig_priority",
    originator,
    "plan_priority",
    wo,
    "work_type"
FROM MainSaver11.dbo.wkaw;
store No1 into No1.qvd(qvd);
drop table No1;

// Reload from the QVD, keeping only records from 2013 onwards
// and excluding work orders starting with 'P' or 'T'.
Directory;
LOAD assetno,
     assetshortdesc,
     date(org_date) as org_date,
     orig_priority,
     upper(originator) as originator,
     plan_priority,
     wo,
     work_type
FROM No1.qvd (qvd)
WHERE org_date >= '01/01/2013'
  and left(wo,1) <> 'P' and left(wo,1) <> 'T';

No2:
SQL SELECT "login_id",
    wo
FROM MainSaver11.dbo.wkaws;
store No2 into No2.qvd(qvd);
drop table No2;

Directory;
LOAD login_id,
     wo
FROM No2.qvd (qvd);

Hi,

I am very new to QlikView, so please be patient.

I have loaded my data into QlikView (see above), but have just come up against a problem with duplicate records.

I know why this happens, but I am unsure how to go about fixing it.

I have loaded data from a call logging system, and for every call logged a unique reference number is created.

When a call is logged, the system records the user's LOGIN ID. Then, every time a user makes a change to the call (for example, completes or closes it), the system records the user's LOGIN ID again.

QlikView is currently showing me (for example) three instances of each unique reference number, with a different LOGIN ID for each time a change was made.
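To illustrate (made-up values, not my real data), the rows behind a single call look something like this:

// Hypothetical sketch of the duplicated rows for one call:
Example:
LOAD * INLINE [
wo, login_id
100234, JSMITH
100234, ABROWN
100234, ABROWN
];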

How do I get QlikView to show me only the LOGIN ID recorded when the call was first created, not the one from each individual change?

I hope this makes sense.

Please let me know if there is anything else I can attach to help you answer my question.

FYI - I have attached an image of both tables I am using.

Thanks

30 Replies
Not applicable
Author

Which of the tables stores duplicates, No1 or No2?

An example of the data would be helpful.

Not applicable
Author

Hi Alexey,

It's the WO that is duplicated (it sits in both table 1 & 2).

I have attached an example of the data. I hope this helps.

Not applicable
Author

When using the date in a chart, apply Only(Distinct date_field).

It will not duplicate the dates.
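For example, with wo as the chart dimension, the expression might be written as (a sketch; org_date is the date field loaded in the script above):

Only(Distinct org_date)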

ecahen
Partner - Contributor II

Hi, can you send me your QVW file?

Not applicable
Author

Sure. What is the best way to send it to you?

Not applicable
Author

Hi Hannah,

Apologies, but how would I do that exactly?

Thank you for your reply.

Susan

ecahen
Partner - Contributor II

If the file is not too heavy, simply upload it. Otherwise you can empty some dimensions first.

Thanks,

Edouard Cahen

Associate Director - ecahen@excelcio.com

Excelcio is a company specialising in performance management - www.excelcio.com


Peter_Cammaert
Partner - Champion III

Pick out the oldest record for each unique reference number by using the GROUP BY clause and the Min() function, like this:

Entries:

LOAD UniqueReferenceNr, Min(DateOfRecord) as CreationDate FROM File.qvd (qvd) GROUP BY UniqueReferenceNr;

Then JOIN the original table to Entries to add further details to the record identifiers you want to keep.
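Put together, it might look like this (a sketch using the placeholder names above; LoginID stands in for whatever field holds the user, and it assumes each change row carries its own DateOfRecord):

Entries:
LOAD UniqueReferenceNr,
     Min(DateOfRecord) as CreationDate
FROM File.qvd (qvd)
GROUP BY UniqueReferenceNr;

// Renaming DateOfRecord to CreationDate makes the join key
// (UniqueReferenceNr, CreationDate), so the INNER JOIN keeps
// only the row written when each call was created.
Inner Join (Entries)
LOAD UniqueReferenceNr,
     DateOfRecord as CreationDate,
     LoginID
FROM File.qvd (qvd);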

Sorry for the brevity. If you post your document in this discussion, we could provide you with a ready example.

Best,

Peter

Not applicable
Author

I am having problems attaching documents on the forum, so fingers crossed this has worked.

Thank you for your help and advice so far. Much appreciated.