Hello to all,
Can anyone help me with this?
I have to evaluate whether the timestamp interval between two records is less than 5 seconds (for example), using Qlik Sense script.
How can I best do this?
I've tried this statement:
if(interval((TimeString - (Previous(TimeString))), 'hh:mm:ss') < time('hh:mm:ss','00:00:05'), 'alert','')
as FlagAlert_5sec
This is the timestamp format (field TimeString): "04.11.2017 00:15:00"
Thank you for your help!
Maybe this:
If(TimeString - Previous(TimeString) < MakeTime(0, 0, 5), 'alert','') as FlagAlert_5sec
if( (TimeString - Previous(TimeString)) < interval#('5','s'), 'alert', '') as FlagAlert_5sec
Thank you
Unfortunately, none of your solutions works; maybe it's not so simple.
From my test, both solutions seem to work; here is a small test script (result shown in the attached image):
tmp:
Load
    Timestamp(Today() + Rand()) as TimeString
AutoGenerate 10000;

tmp2:
Load
    TimeString,
    Interval(TimeString - Previous(TimeString), 'hh:mm:ss') as Interval_Prev,
    If((TimeString - Previous(TimeString)) < Interval#('5', 's'), 'alert', '') as FlagAlert_5sec,
    If(TimeString - Previous(TimeString) < MakeTime(0, 0, 5), 'alert', '') as FlagAlert_5sec_Sunny
Resident tmp
Order By TimeString;

Drop Table tmp;
Could you post some rows of your data?
How about this? If TimeString is loaded as text in that format, it needs to be interpreted as a timestamp first:
If(TimeStamp#(TimeString, 'DD.MM.YYYY hh:mm:ss') - Previous(TimeStamp#(TimeString, 'DD.MM.YYYY hh:mm:ss')) < MakeTime(0, 0, 5), 'alert','') as FlagAlert_5sec
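If that approach works, it may be cleaner to interpret the text timestamp once into a real (numeric) field and then compare that field; a minimal sketch, assuming the CSV rows are already loaded into a table called Raw and that the field name TS is hypothetical:

Typed:
Load
    *,
    // interpret the text once (format DD.MM.YYYY hh:mm:ss) into a numeric dual timestamp
    Timestamp#(TimeString, 'DD.MM.YYYY hh:mm:ss') as TS
Resident Raw;

Flagged:
Load
    *,
    // timestamp differences are day fractions, so MakeTime(0, 0, 5) = 5 seconds
    If(TS - Previous(TS) < MakeTime(0, 0, 5), 'alert', '') as FlagAlert_5sec
Resident Typed
Order By TS;

Drop Tables Raw, Typed;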
In fact you're right;
the problem could be the aggregations we have to make before evaluating the delay in seconds for each group of timestamps.
The example below shows a data subset with a delay of 1 second between "04.11.2017 07:00:00" and "04.11.2017 07:00:01" for all the aggregations ("DB_CONTABILIZZATORI_2_ET20-ENERGIA_TERMICA_ASSOLUTA_IN_VAPORE_USCITA_CALDAIA" is one of them).
We are transforming a set of CSV files, extracted from the energy machines' PLCs every 15 minutes, containing the measurements of many dimensions (the aggregations); sometimes the CSV files are extracted twice for the same energy check, with a delay of less than 5 seconds, and we have to monitor this.
============
"DB_CONTABILIZZATORI_2_ET20-ENERGIA_TERMICA_ASSOLUTA_IN_VAPORE_USCITA_CALDAIA-ENERGIA_TERMICA_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";72656000;1;43043291672,1065
"DB_CONTABILIZZATORI_2_ET20-ENERGIA_TERMICA_ASSOLUTA_IN_VAPORE_USCITA_CALDAIA-QUANTITA_VAPORE_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";79490600;1;43043291672,1181
"DB_CONTABILIZZATORI_3_ET21- ENERGIA_TERMICA_VAPORE_INGRESSO_TURBINA_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";11296136;1;43043291672,1181
"DB_CONTABILIZZATORI_3_ET21- PORTATA_VAPORE_VAPORE_INGRESSO_TURBINA_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";12458552;1;43043291672,1181
"DB_CONTABILIZZATORI_3_ET22- ENERGIA_TERMICA_VAPORE_INGRESSO_DEGASATORE_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";419425;1;43043291672,1181
"DB_CONTABILIZZATORI_3_ET22- PORTATA_VAPORE_INGRESSO_DEGASATORE_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";514255;1;43043291672,1181
"DB_CONTABILIZZATORI_2_ET23- ENERGIA_TERMICA_ACQUA_ALIMENTO_CALDAIA-ENERGIA_TERMICA_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";11542700;1;43043291672,1181
"DB_CONTABILIZZATORI_2_ET23- ENERGIA_TERMICA_ACQUA_ALIMENTO_CALDAIA-QUANTITA_ACQUA_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";85854;1;43043291672,1181
"DB_CONTABILIZZATORI_3_ET24- ENERGIA_TERMICA_ACQUA_PRERISCALDO_DEGASATORE_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:00";4213;1;43043291672,1759
"DB_CONTABILIZZATORI_3_ET25- ENERGIA_TERMICA_ACQUA_ATTEMPERAMENTO_VAPORE_COUNTERS_GRAND_TOTAL";"04.11.2017 07:00:01";3910;1;43043291672,662
"DB_CONTABILIZZATORI_ENERGIA_ATTIVA_PRELEVATA_SCAMBIO.COUNTERS.GRAND_TOTAL";"04.11.2017 07:00:01";328958;1;43043291672,6736
"DB_CONTABILIZZATORI_PORTATA_BIOMASSA.COUNTERS.GRAND_TOTAL";"04.11.2017 07:00:01";406952;1;43043291672,6736
"DB_CONTABILIZZATORI_PORTATA_CONDENSE.COUNTERS.GRAND_TOTAL";"04.11.2017 07:00:01";18145;1;43043291672,6852
"DB_CONTABILIZZATORI_PORTATA_DRYCOOLER.COUNTERS.GRAND_TOTAL";"04.11.2017 07:00:01";931959;1;43043291672,6968
"DB_CONTABILIZZATORI_PORTATA_RECUPERO.COUNTERS.GRAND_TOTAL";"04.11.2017 07:00:01";2438657;1;43043291672,7083
"DB_CONTABILIZZATORI_PORTATA_VAPORE.COUNTERS.GRAND_TOTAL";"04.11.2017 07:00:01";21620620;1;43043291672,7199
"DB_CONTABILIZZATORI_ENERGIA_ATTIVA_IMMESSA.COUNTERS.GRAND_TOTAL";"04.11.2017 07:15:01";7921425;1;43043302089,8727
"DB_CONTABILIZZATORI_ENERGIA_ATTIVA_IMMESSA_SCAMBIO.COUNTERS.GRAND_TOTAL";"04.11.2017 07:15:01";6429825;1;43043302089,8843
"DB_CONTABILIZZATORI_ENERGIA_ATTIVA_PRELEVATA.COUNTERS.GRAND_TOTAL";"04.11.2017 07:15:01";10;1;43043302089,8843
How do you determine your group?
For example, ""DB_CONTABILIZZATORI_ENERGIA_ATTIVA_IMMESSA.COUNTERS.GRAND_TOTAL" is a group, one of aggregations
I'm thinking of solving this with a Sum(Aggr(If(...))) formula in the front end instead of modifying the script.
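One possible shape of such a front-end expression, counting per-group gaps under 5 seconds (only a sketch; TagName and TS are the same hypothetical fields as above, and it assumes TS is a numeric timestamp so that Above() walks the rows in ascending time order within each group):

Sum(
    Aggr(
        If(TS - Above(TS) < MakeTime(0, 0, 5), 1, 0),
        TagName, TS
    )
)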