aniruddhyadutta
Creator

Oracle timestamp conversion in Qlik

Hi All,

I have been struggling to convert an Oracle datetime field which is stored as a varchar.

For my incremental load I need to compare timestamps between Oracle and Qlik.

For that reason I have converted the Oracle varchar field to a timestamp.

In Oracle the datetime field comes through as: 201705300752435634343

After I convert it to a timestamp: 30-MAY-17 07.52.43.563434300 AM

Now, in Qlik I think fractions of a second are only supported up to the 5th decimal place.

The formula I am using is: =Timestamp(Timestamp#(DATETIME,'YYYYMMDDhhmmssfffffffff'),'DD-MMM-YY hh.mm.ss.fffffffff TT')

which gives the result below. The last six decimal places are defaulted to zero, and the month name comes out in lower case.

Capture.JPG

Can anyone please help me convert the DATETIME field in Qlik so that it matches the Oracle timestamp exactly (i.e. 30-MAY-17 07.52.43.563434300 AM), so that I can do a comparison in my incremental load?
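A note on the precision ceiling (based on general Qlik behaviour, not stated in the thread): Qlik stores timestamps as double-precision day serial numbers, so digits much beyond milliseconds cannot survive a Timestamp#() round trip. One workaround sketch is to skip the numeric conversion for the comparison and compare the raw fixed-width strings instead, since YYYYMMDDhhmmss plus a fraction sorts chronologically as text (SOURCE_TABLE and vMaxRaw are invented names):

```
// vMaxRaw is assumed to hold the largest raw DATETIME string already loaded,
// e.g. '201705300752435634343'. Because the varchar is fixed-width and
// zero-padded, lexicographic order equals chronological order.
NewRows:
SQL SELECT *
FROM SOURCE_TABLE
WHERE DATETIME > '$(vMaxRaw)';
```

This keeps every fractional digit intact because no parsing ever happens.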

8 Replies
Anonymous
Not applicable

For incremental loads I usually start from a minute before the previous Qlik load started: get from the database all rows amended after that time, and if the key of any incremental row clashes with an existing Qlik row, overwrite the Qlik row with the incremental database row.

This eliminates any worries about date precision / conversion such as you describe.

[It also eliminates some other worries you may not have started worrying about yet]
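A minimal sketch of this overlap-and-overwrite pattern in Qlik load script (table, field, and QVD names are all invented for illustration; the TO_DATE format would need to match your variable's format):

```
LET vLoadFrom = Timestamp(ReloadTime() - 1/1440);  // previous reload start minus 1 minute

Orders:
SQL SELECT OrderID, Amount, LAST_UPDATED
FROM ORDERS
WHERE LAST_UPDATED >= TO_DATE('$(vLoadFrom)', 'DD/MM/YYYY HH24:MI:SS');

// Append the historical rows whose key was NOT re-extracted above;
// any clashing keys are thereby replaced by the fresh database rows.
Concatenate (Orders)
LOAD * FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);

STORE Orders INTO Orders.qvd (qvd);
```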

gsbeaton
Luminary Alumni

Why don't you truncate the microseconds off completely? I doubt you need that level of accuracy. You could just do something like LEFT(TIMESTAMP, 14) on your Oracle date varchar field and then compare it against a similar timestamp format (e.g. 'DD-MMM-YY hh.mm.ss').
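A sketch of that truncation in Qlik script (the inline row just illustrates the raw value from the question):

```
Staging:
LOAD * INLINE [
DATETIME
201705300752435634343
];

Data:
LOAD
    // Keep only the first 14 digits (YYYYMMDDhhmmss) and parse them.
    Timestamp#(Left(DATETIME, 14), 'YYYYMMDDhhmmss') AS UpdateTime
RESIDENT Staging;

DROP TABLE Staging;
```

Formatted with Timestamp(UpdateTime, 'DD-MMM-YY hh.mm.ss') this can then be compared against a similarly truncated Oracle value.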

aniruddhyadutta
Creator
Author

Bill,

Thanks for your reply.

Your statutory warning really scared me...

My incremental load runs every 5 minutes, and we are not keen on reducing performance in any way if it is not strictly necessary. Can you please describe a scenario where this 1-minute offset would make a difference?

aniruddhyadutta
Creator
Author

Hi George,

Thanks for reply.

If I truncate the millisecond part, and in my incremental load comparison I use DATETIME >= the history QVD's max time,

do you think there is any scenario where I could miss data?

The "equal to" part should take care of records that end up with the same timestamp after truncation, if my understanding is correct.
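The flip side of >= is that rows sharing the boundary timestamp are extracted twice, so a dedupe step usually accompanies it. A sketch with invented names:

```
// vMaxTime is assumed to hold the truncated max timestamp from the history QVD.
Orders:
SQL SELECT OrderID, Amount, LAST_UPDATED
FROM ORDERS
WHERE LAST_UPDATED >= TO_DATE('$(vMaxTime)', 'DD/MM/YYYY HH24:MI:SS');

// Keep history rows only if their key was not just re-extracted,
// so the boundary rows are not duplicated.
Concatenate (Orders)
LOAD * FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);
```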

Anonymous
Not applicable

A good question to which the answer is "it all depends...."

The scenario you mention is an incremental reload every 5 minutes, as opposed to, say, a daily one. I presume you wish to get your Qlik data as near real time as possible.

The crux is the Qlik last reload datetime and the various Oracle last updated timestamps.

You ask about a potential scenario where things can go adrift, and one is where the Qlik clock is out of sync with the Oracle clock. In a perfect world time is time and always the same, but unfortunately the world is not perfect. Whilst it is good practice to have a time server, synced to an external NTP server, that all your internal servers sync their time to, thus ensuring they all run on the same time, I have seen setups where such time syncing has failed or was simply never set up. Qlik used its server time, Oracle used its own, and the two had drifted apart.

I have also seen a scenario where the Oracle timestamp field was populated from timestamps in incoming messages, generated externally; message queues built up, creating a lag between when a timestamp was generated and when the database was actually updated with it.

Sometimes it can be best to populate the Qlik timestamp from the latest timestamp that Oracle returns. Some people trawl the Qlik QVD to find the latest timestamp loaded from Oracle, but if the QVD is large that trawl can often take a fair amount of time in its own right.

As "it all depends....", could you outline your process flow? Here I make up an example:

  • Do an initial full load of all data from Oracle and create initial QVD
    • Store in another QVD the Qlik Server timestamp it started at
  • Do an incremental extract of the Oracle data updated after the time stored in the QVD
    • Read the initial QVD into Qlik
    • Concatenate on your incremental data from Oracle.
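The flow above might look like this in script form (every name is made up; the second part would run on each subsequent reload):

```
// --- Initial full load ---
LET vStartTime = Timestamp(Now());   // capture when this extract began

Orders:
SQL SELECT OrderID, Amount, LAST_UPDATED FROM ORDERS;
STORE Orders INTO Orders.qvd (qvd);

LoadLog:
LOAD '$(vStartTime)' AS LastLoadTime AutoGenerate 1;
STORE LoadLog INTO LoadLog.qvd (qvd);
DROP TABLES Orders, LoadLog;

// --- Incremental run ---
LoadLog:
LOAD LastLoadTime FROM LoadLog.qvd (qvd);
LET vLastLoad = Peek('LastLoadTime', 0, 'LoadLog');
DROP TABLE LoadLog;

Orders:
LOAD * FROM Orders.qvd (qvd);        // read the initial QVD

Concatenate (Orders)                 // bolt on the new rows
SQL SELECT OrderID, Amount, LAST_UPDATED
FROM ORDERS
WHERE LAST_UPDATED > TO_DATE('$(vLastLoad)', 'DD/MM/YYYY HH24:MI:SS');
```

It deliberately mirrors the basic flow, including its caveats about deleted and amended rows.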

This is a really basic scenario, not least because it won't take into account Oracle rows that have been deleted, and it will duplicate Oracle rows that have been amended.

I often find it useful to write the problem down first - have a look at this blog post by HIC, The Feynman Problem Solving Algorithm

gsbeaton
Luminary Alumni

I don't see any scenario where your >= logic would miss data so I would think you are good to go with that logic.  Does your date stamp also pick up updated records?

Have a look at Rob Wunderlich's post on the fastest way to read a Max time from QVD for some good advice on how to get your comparison time from the QVD.
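One common way to avoid scanning a large QVD for its max time (a sketch; not necessarily the exact technique in the linked post) is to store the max in its own one-row QVD at write time:

```
// At the end of each extract (all names invented):
MaxTime:
LOAD Max(UpdateTime) AS MaxUpdateTime RESIDENT Orders;
STORE MaxTime INTO MaxTime.qvd (qvd);
DROP TABLE MaxTime;

// The next run reads one row instead of trawling the full history QVD:
MaxTime:
LOAD MaxUpdateTime FROM MaxTime.qvd (qvd);
LET vMaxTime = Peek('MaxUpdateTime', 0, 'MaxTime');
DROP TABLE MaxTime;
```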

Good luck

George

aniruddhyadutta
Creator
Author

I really do apologise for such a late reply... but thanks a zillion for your detailed reply, Bill. It really removed some of my doubts. Thanks again!


cheers

AD

aniruddhyadutta
Creator
Author

Hi George,

You know what, I posted this question a month ago and we haven't encountered any issues using that >= logic so far.

Thanks for taking the time to reply!!