Anonymous
Not applicable

parse json containing a dictionary

Hello,

 

I have a JSON file as below. I have performed metadata scanning and was able to fetch name and id, but data is a hash map whose keys are dates that change every day. I need the columns parsed as below:

 

name, id, <date1>,<date2>...<date n>

location,20,36.0,64.0...

 

Can someone please help?

 

 

[
  {
    "name": "location",
    "id": 20,
    "data": {
      "2018-09-06T00:00:00": 36.0,
      "2018-09-06T00:10:00": 64.0,
      "2018-09-06T00:20:00": 88.0,
      "2018-09-06T00:30:00": 52.0,
      "2018-09-06T00:40:00": 10.0,
      "2018-09-06T00:50:00": 28.0,
      "2018-09-06T01:00:00": 46.0,
      "2018-09-06T01:10:00": 42.0,
      "2018-09-06T01:20:00": 36.0,
      "2018-09-06T01:30:00": 24.0,
      "2018-09-06T01:40:00": 0.0,
      "2018-09-06T01:50:00": 0.0,
      "2018-09-06T02:00:00": -2.0,
      "2018-09-06T02:10:00": -2.0,
      "2018-09-06T02:20:00": -2.0,
      "2018-09-06T02:30:00": 4.0,
      "2018-09-06T02:40:00": 0.0,
      "2018-09-06T02:50:00": 2.0,
      "2018-09-06T03:00:00": 6.0,
      "2018-09-06T03:10:00": 0.0,
      "2018-09-06T03:20:00": 2.0,
      "2018-09-06T03:30:00": 6.0,
      "2018-09-06T03:40:00": 8.0,
      "2018-09-06T03:50:00": 8.0,
      "2018-09-06T04:00:00": 8.0,
      "2018-09-06T04:10:00": 8.0,
      "2018-09-06T04:20:00": 8.0,
      "2018-09-06T04:30:00": 4.0,
      "2018-09-06T04:40:00": 10.0,
      "2018-09-06T04:50:00": 18.0,
      "2018-09-06T05:00:00": 20.0,
      "2018-09-06T05:10:00": 22.0,
      "2018-09-06T05:20:00": 8.0,
      "2018-09-06T05:30:00": 4.0,
      "2018-09-06T05:40:00": 8.0,
      "2018-09-06T05:50:00": 4.0,
      "2018-09-06T06:00:00": 4.0,
      "2018-09-06T06:10:00": 8.0,
      "2018-09-06T06:20:00": 14.0,
      "2018-09-06T06:30:00": 16.0,
      "2018-09-06T06:40:00": 8.0,
      "2018-09-06T06:50:00": 22.0,
      "2018-09-06T07:00:00": 24.0,
      "2018-09-06T07:10:00": 12.0,
      "2018-09-06T07:20:00": 6.0,
      "2018-09-06T07:30:00": 12.0,
      "2018-09-06T07:40:00": 22.0,
      "2018-09-06T07:50:00": 24.0,
      "2018-09-06T08:00:00": 22.0,
      "2018-09-06T08:10:00": 30.0,
      "2018-09-06T08:20:00": 12.0,
      "2018-09-06T08:30:00": 4.0,
      "2018-09-06T08:40:00": 16.0,
      "2018-09-06T08:50:00": 20.0,
      "2018-09-06T09:00:00": 4.0,
      "2018-09-06T09:10:00": 6.0,
      "2018-09-06T09:20:00": 4.0,
      "2018-09-06T09:30:00": 10.0,
      "2018-09-06T09:40:00": 4.0,
      "2018-09-06T09:50:00": 8.0,
      "2018-09-06T10:00:00": 2.0,
      "2018-09-06T10:10:00": 6.0,
      "2018-09-06T10:20:00": 30.0,
      "2018-09-06T10:30:00": 56.0,
      "2018-09-06T10:40:00": 42.0,
      "2018-09-06T10:50:00": 50.0,
      "2018-09-06T11:00:00": 30.0,
      "2018-09-06T11:10:00": 16.0,
      "2018-09-06T11:20:00": 14.0,
      "2018-09-06T11:30:00": 8.0,
      "2018-09-06T11:40:00": 32.0,
      "2018-09-06T11:50:00": 64.0,
      "2018-09-06T12:00:00": 50.0,
      "2018-09-06T12:10:00": 30.0,
      "2018-09-06T12:20:00": 34.0,
      "2018-09-06T12:30:00": 18.0,
      "2018-09-06T12:40:00": 18.0,
      "2018-09-06T12:50:00": 26.0,
      "2018-09-06T13:00:00": 76.0,
      "2018-09-06T13:10:00": 76.0,
      "2018-09-06T13:20:00": 62.0,
      "2018-09-06T13:30:00": 52.0,
      "2018-09-06T13:40:00": 62.0,
      "2018-09-06T13:50:00": 54.0,
      "2018-09-06T14:00:00": 56.0,
      "2018-09-06T14:10:00": 76.0,
      "2018-09-06T14:20:00": 68.0,
      "2018-09-06T14:30:00": 74.0,
      "2018-09-06T14:40:00": 88.0,
      "2018-09-06T14:50:00": 86.0,
      "2018-09-06T15:00:00": 116.0,
      "2018-09-06T15:10:00": 100.0,
      "2018-09-06T15:20:00": 126.0,
      "2018-09-06T15:30:00": 92.0,
      "2018-09-06T15:40:00": 96.0,
      "2018-09-06T15:50:00": 120.0,
      "2018-09-06T16:00:00": 104.0,
      "2018-09-06T16:10:00": 90.0,
      "2018-09-06T16:20:00": 120.0,
      "2018-09-06T16:30:00": 78.0,
      "2018-09-06T16:40:00": 98.0,
      "2018-09-06T16:50:00": 118.0,
      "2018-09-06T17:00:00": 108.0,
      "2018-09-06T17:10:00": 168.0,
      "2018-09-06T17:20:00": 120.0,
      "2018-09-06T17:30:00": 148.0,
      "2018-09-06T17:40:00": 100.0,
      "2018-09-06T17:50:00": 80.0,
      "2018-09-06T18:00:00": 92.0,
      "2018-09-06T18:10:00": 78.0,
      "2018-09-06T18:20:00": 58.0,
      "2018-09-06T18:30:00": 76.0,
      "2018-09-06T18:40:00": 118.0,
      "2018-09-06T18:50:00": 70.0,
      "2018-09-06T19:00:00": 70.0,
      "2018-09-06T19:10:00": 42.0,
      "2018-09-06T19:20:00": 28.0,
      "2018-09-06T19:30:00": 32.0,
      "2018-09-06T19:40:00": 36.0,
      "2018-09-06T19:50:00": 26.0,
      "2018-09-06T20:00:00": 14.0,
      "2018-09-06T20:10:00": 16.0,
      "2018-09-06T20:20:00": 20.0,
      "2018-09-06T20:30:00": 24.0,
      "2018-09-06T20:40:00": 20.0,
      "2018-09-06T20:50:00": 18.0,
      "2018-09-06T21:00:00": 22.0,
      "2018-09-06T21:10:00": 26.0,
      "2018-09-06T21:20:00": 24.0,
      "2018-09-06T21:30:00": 14.0,
      "2018-09-06T21:40:00": 14.0,
      "2018-09-06T21:50:00": 12.0,
      "2018-09-06T22:00:00": 8.0,
      "2018-09-06T22:10:00": 8.0,
      "2018-09-06T22:20:00": 2.0,
      "2018-09-06T22:30:00": 2.0,
      "2018-09-06T22:40:00": 2.0,
      "2018-09-06T22:50:00": 4.0,
      "2018-09-06T23:00:00": 8.0,
      "2018-09-06T23:10:00": 4.0,
      "2018-09-06T23:20:00": 6.0,
      "2018-09-06T23:30:00": 2.0,
      "2018-09-06T23:40:00": 2.0
    }
  }
]

 

1 Reply
Jesperrekuh
Specialist

@arunshankar, please keep this in the same topic!

 

https://community.talend.com/t5/Design-and-Development/Parse-json-containing-hashmap/m-p/133144

and

https://community.talend.com/t5/Design-and-Development/Parse-json-containing-date/m-p/133176

 

As I answered in the previous/linked topics: did you try tJavaFlex? All of the Talend JSON components need the 'key' mapped to a fixed column, which doesn't work when the keys change every day.

 

You could also write a custom routine that loads the "data" object from the JSON and returns a HashMap, ArrayList, or whatever structure you want.
If anybody has a different solution, please share it, because I would also like a component-based solution.
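To make the routine idea concrete, here is a minimal sketch of the reshaping step (in Python for brevity; a Talend routine would do the same in Java with a JSON library). The `pivot` function is a hypothetical name, not a Talend API: it merges the dynamic "data" keys into the row so each timestamp becomes its own column, matching the `name, id, <date1>, <date2>...` layout asked for above.

```python
import json

# A trimmed record shaped like the JSON in the question:
# "data" is a dict whose timestamp keys change every day.
raw = '''
[
  {
    "name": "location",
    "id": 20,
    "data": {
      "2018-09-06T00:00:00": 36.0,
      "2018-09-06T00:10:00": 64.0
    }
  }
]
'''

def pivot(records):
    """Flatten each record so the dynamic "data" keys become columns."""
    rows = []
    for rec in records:
        row = {"name": rec["name"], "id": rec["id"]}
        row.update(rec["data"])  # one column per timestamp key
        rows.append(row)
    return rows

rows = pivot(json.loads(raw))
print(rows[0])
# {'name': 'location', 'id': 20, '2018-09-06T00:00:00': 36.0, '2018-09-06T00:10:00': 64.0}
```

Because the column set is only known at runtime, the Talend side would still need something schema-flexible (tJavaFlex, or the Dynamic column type) to carry these rows downstream.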