Qlik Community

QlikView App Dev

Discussion Board for collaboration related to QlikView App Development.

mp802377
Creator

Concatenating 3 fields (Date, Hour, Minute) while using Peek for incremental Loads.

Three main questions:

  • How do I concatenate the fields Date, Hour, and Minute from the table below? I have only seen examples with a date field and a time field, not with hour and minute separated like they are here.
  • When you do incremental loads, how do you prevent massive QVDs? If I pull a rolling 45-day file, does March 10th simply drop off? I have read that QlikView cannot delete rows from a QVD.
  • If I concatenate the Date/Hour/Minute fields into 04/20/2022 04:15 and run the load at 4:15, will Peek know, on the next run an hour later, to pull only data after 4:15? I have only used Peek to look at the previous file to know which day to load next. Can it work with a full timestamp?

 

I have a very simple dataset with no identifiers (no names or IDs); the data I access contains no personal information. I created a field that concatenates all of the fields so I can look for duplicates (since there are no identifiers). This is a massive strain on the system, as it needs to be updated multiple times a day (see question #3).
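Instead of concatenating every field into one long string, a hash of all fields gives a compact duplicate key that is much cheaper to compare. A minimal sketch, assuming the field names from Table 1; the source file name and the hash128/autonumber combination are illustrative, not the poster's actual script:

```
// Build a compact row key instead of a long concatenated string.
// hash128() digests all fields at once; autonumber() maps each distinct
// hash to a small sequential integer, which keeps memory usage down.
Data:
load *,
     autonumber(hash128(Date, Hour, Minute, Country, Office_Loc, Type,
                        Parts, PartID, PartNumber, Fail, TrackBegin, TrackEnd)) as RowKey
from daily_data.qvd (qvd);  // placeholder file name
```

Note that autonumber() values are only consistent within a single reload, so a key like this is suitable for counting duplicates inside one app, not for comparing keys across apps.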

 

Explanation of data:

See Table 1. My end goal is to create charts and something similar to Table 2. The table contains about 30 million rows for each day. Because the dataset is so large, I created a QlikView app that pulls in each day of data and saves it as a QVD. Then I created another application that reads the number of days to keep from a config file and pulls that many daily QVDs into one large QVD file.
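The consolidation step described above can be sketched as a loop over the last N daily QVDs. Here vDays, the daily_YYYYMMDD.qvd naming, and the output file name are assumptions:

```
// vDays comes from the config file, e.g. 45
for i = 0 to $(vDays) - 1
    let vFile = 'daily_' & date(today() - $(i), 'YYYYMMDD') & '.qvd';
    if not isnull(filetime('$(vFile)')) then
        Big:
        load * from [$(vFile)] (qvd);  // identical field lists auto-concatenate
    end if
next i
store Big into big_rolling.qvd (qvd);
```

Because only the most recent vDays files are read, older days simply stop being included; no rows ever have to be deleted from a QVD.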

For this simple set of data, I created multiple tables to get to Table 2: one table to identify whether a row has been duplicated (some rows have 134 duplicates; I count each set as one duplicate), another to identify the start and end of the entire trip, and another to find failures, since one part of a journey could have succeeded while another part did not make it.
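The per-journey start/end table described here can be sketched with a grouped load, assuming PartID identifies a journey, PartNumber orders its legs, and the raw table was loaded under the name Data. This is an illustrative approach, not necessarily the poster's script:

```
// JourneyStart = TrackBegin of the first leg, JourneyEnd = TrackEnd of the last leg
Journey:
load PartID,
     FirstSortedValue(TrackBegin, PartNumber)  as JourneyStart,
     FirstSortedValue(TrackEnd, -PartNumber)   as JourneyEnd,
     max(Fail)                                 as JourneyFail   // 1 if any leg failed
resident Data
group by PartID;
```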

 

Table 1:

| Date | Hour | Minute | Country | Office_Loc | Type | Parts | PartID | PartNumber | Fail | TrackBegin | TrackEnd |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 44536 | 1 | 12 | US | 2J56Y | A | 3 | 34513543 | 1 | 0 | ATLANTA | SAN DIEGO |
| 44536 | 1 | 12 | US | 2J56Y | A | 3 | 34513543 | 2 | 0 | SAN DIEGO | SAN FRANCISCO |
| 44536 | 1 | 12 | US | 2J56Y | A | 3 | 34513543 | 3 | 0 | SAN FRANCISCO | TOKYO |
| 44656 | 1 | 45 | US | J7K5U | A | 1 | 15531542 | 1 | 0 | DENVER | SAN DIEGO |
| 44656 | 1 | 45 | US | J7K5U | A | 1 | 15531542 | 1 | 0 | DENVER | SAN DIEGO |
| 44656 | 1 | 45 | US | J7K5U | A | 1 | 15531542 | 1 | 0 | DENVER | SAN DIEGO |
| 44657 | 1 | 39 | US | D24J2 | A | 2 | 24584734 | 1 | 0 | SEATTLE | DENVER |
| 44657 | 1 | 39 | US | D24J2 | A | 2 | 24584734 | 2 | 0 | DENVER | ATLANTA |
| 44669 | 2 | 5 | US | E2N4G | A | 3 | 213431552 | 1 | 0 | SAN DIEGO | ATLANTA |
| 44669 | 2 | 5 | US | E2N4G | A | 3 | 213431552 | 2 | 0 | ATLANTA | NEW YORK |
| 44669 | 2 | 5 | US | E2N4G | A | 3 | 213431552 | 3 | 0 | NEW YORK | LONDON |
| 44677 | 2 | 24 | CA | B35JDJ | R | 1 | 54313158 | 1 | 1 | SEATTLE | DENVER |
| 44677 | 2 | 24 | CA | B35JDJ | R | 1 | 54313158 | 1 | 1 | SEATTLE | DENVER |
| 44655 | 2 | 46 | CA | GX8H3 | R | 2 | 243543345 | 1 | 0 | TAMPA | DENVER |
| 44655 | 2 | 46 | CA | GX8H3 | R | 2 | 243543345 | 2 | 0 | DENVER | SEATTLE |

 

Table 2: 

| Date | Hour | Minute | Country | Duplicates | Office_Loc | Type | Parts | PartID | PartNumber | Fail | TrackBegin | TrackEnd | JourneyStart | JourneyEnd |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 44536 | 1 | 12 | US | 1 | 2J56Y | A | 3 | 34513543 | 1 | 0 | ATLANTA | SAN DIEGO | ATLANTA | TOKYO |
| 44536 | 1 | 12 | US | 1 | 2J56Y | A | 3 | 34513543 | 2 | 0 | SAN DIEGO | SAN FRANCISCO | ATLANTA | TOKYO |
| 44536 | 1 | 12 | US | 1 | 2J56Y | A | 3 | 34513543 | 3 | 0 | SAN FRANCISCO | TOKYO | ATLANTA | TOKYO |
| 44656 | 1 | 45 | US | 3 | J7K5U | A | 1 | 15531542 | 1 | 0 | DENVER | SAN DIEGO | DENVER | SAN DIEGO |
| 44657 | 1 | 39 | US | 1 | D24J2 | A | 2 | 24584734 | 1 | 0 | SEATTLE | DENVER | SEATTLE | ATLANTA |
| 44657 | 1 | 39 | US | 1 | D24J2 | A | 2 | 24584734 | 2 | 0 | DENVER | ATLANTA | SEATTLE | ATLANTA |
| 44669 | 2 | 5 | US | 1 | E2N4G | A | 3 | 213431552 | 1 | 0 | SAN DIEGO | ATLANTA | SAN DIEGO | LONDON |
| 44669 | 2 | 5 | US | 1 | E2N4G | A | 3 | 213431552 | 2 | 0 | ATLANTA | NEW YORK | SAN DIEGO | LONDON |
| 44669 | 2 | 5 | US | 1 | E2N4G | A | 3 | 213431552 | 3 | 0 | NEW YORK | LONDON | SAN DIEGO | LONDON |
| 44677 | 2 | 24 | CA | 2 | B35JDJ | R | 1 | 54313158 | 1 | 1 | SEATTLE | DENVER | SEATTLE | DENVER |
| 44655 | 2 | 46 | CA | 1 | GX8H3 | R | 2 | 243543345 | 1 | 0 | TAMPA | DENVER | TAMPA | SEATTLE |
| 44655 | 2 | 46 | CA | 1 | GX8H3 | R | 2 | 243543345 | 2 | 0 | DENVER | SEATTLE | TAMPA | SEATTLE |

 


Accepted Solutions
marcus_sommer
MVP & Luminary

At 30 M records per day, creating a daily QVD is surely a good idea for the historical data. For the current data you might consider creating hourly QVDs. If there are always only new records, having no unique ID is not a problem; otherwise a timestamp should work quite well. You could create it with:

timestamp(Date + (Hour / 24) + (Minute / 24 / 60)) as Timestamp

and afterwards you may use something like:

load *, timestamp(Date + (Hour / 24) + (Minute / 24 / 60)) as Timestamp;
select * from database;
concatenate
load * from hourly_*.qvd (qvd) where not exists(Timestamp);

Another approach to this type of incremental load is to fetch the latest Timestamp from the already-loaded data and store it in a variable, like:

t: load * from hourly_*.qvd (qvd);

let varTimestamp = peek('Timestamp', -1, 't'); // if the data are ordered by time

concatenate (t)
load *, timestamp(Date + (Hour / 24) + (Minute / 24 / 60)) as Timestamp;
select * from database
where timestamp(Date + (Hour / 24) + (Minute / 24 / 60)) > '$(varTimestamp)';

or

max: load max(fieldvalue('Timestamp', recno())) as max autogenerate fieldvaluecount('Timestamp');

let varTimestamp = peek('max', 0, 'max');

On the first load you may not yet have a valid variable value, but it could be set manually (more complex and generic solutions are of course possible). Also, the timestamp generation shown may need some adjustment to fit your database, because it uses Qlik logic/functions; perhaps a cast() is needed.
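Seeding the variable on the very first run, when no QVD or history exists yet, could look like this; the fallback date is an arbitrary placeholder:

```
// use a safe early default when varTimestamp has never been set
if len('$(varTimestamp)') = 0 then
    let varTimestamp = timestamp(makedate(2021, 1, 1));  // adjust to your history
end if
```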

That's just for the QVDs. For the dashboard you could apply a binary load that contains the last 44 days, concatenate the hourly QVDs on top of it, and, in an overnight job, create the daily QVD and update the binary to this state.
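The binary-load pattern described could be sketched like this; the qvw name and the table name Data are assumptions:

```
// binary must be the very first statement in the script:
// it restores the 44-day history kept in another app
binary [history_44_days.qvw];

// append today's hourly slices on top of the restored state
concatenate (Data)
load * from hourly_*.qvd (qvd) where not exists(Timestamp);
```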

- Marcus

 
