Hi,
I have a JSON payload as shown below. While performing metadata scanning I am able to get name and id, but data is a hashmap/dictionary with a datetime as the key and its corresponding value.
Could you please tell me how to parse that dictionary/hashmap in the JSON below using Talend components?
[
{
"name": "location",
"id": 20,
"data": {
"2018-09-06T00:00:00": 36.0,
"2018-09-06T00:10:00": 64.0,
"2018-09-06T00:20:00": 88.0,
"2018-09-06T00:30:00": 52.0,
"2018-09-06T00:40:00": 10.0,
"2018-09-06T00:50:00": 28.0,
"2018-09-06T01:00:00": 46.0,
"2018-09-06T01:10:00": 42.0,
"2018-09-06T01:20:00": 36.0,
"2018-09-06T01:30:00": 24.0,
"2018-09-06T01:40:00": 0.0,
"2018-09-06T01:50:00": 0.0,
"2018-09-06T02:00:00": -2.0,
"2018-09-06T02:10:00": -2.0,
"2018-09-06T02:20:00": -2.0,
"2018-09-06T02:30:00": 4.0,
"2018-09-06T02:40:00": 0.0,
"2018-09-06T02:50:00": 2.0,
"2018-09-06T03:00:00": 6.0,
"2018-09-06T03:10:00": 0.0,
"2018-09-06T03:20:00": 2.0,
"2018-09-06T03:30:00": 6.0,
"2018-09-06T03:40:00": 8.0,
"2018-09-06T03:50:00": 8.0,
"2018-09-06T04:00:00": 8.0,
"2018-09-06T04:10:00": 8.0,
"2018-09-06T04:20:00": 8.0,
"2018-09-06T04:30:00": 4.0,
"2018-09-06T04:40:00": 10.0,
"2018-09-06T04:50:00": 18.0,
"2018-09-06T05:00:00": 20.0,
"2018-09-06T05:10:00": 22.0,
"2018-09-06T05:20:00": 8.0,
"2018-09-06T05:30:00": 4.0,
"2018-09-06T05:40:00": 8.0,
"2018-09-06T05:50:00": 4.0,
"2018-09-06T06:00:00": 4.0,
"2018-09-06T06:10:00": 8.0,
"2018-09-06T06:20:00": 14.0,
"2018-09-06T06:30:00": 16.0,
"2018-09-06T06:40:00": 8.0,
"2018-09-06T06:50:00": 22.0,
"2018-09-06T07:00:00": 24.0,
"2018-09-06T07:10:00": 12.0,
"2018-09-06T07:20:00": 6.0,
"2018-09-06T07:30:00": 12.0,
"2018-09-06T07:40:00": 22.0,
"2018-09-06T07:50:00": 24.0,
"2018-09-06T08:00:00": 22.0,
"2018-09-06T08:10:00": 30.0,
"2018-09-06T08:20:00": 12.0,
"2018-09-06T08:30:00": 4.0,
"2018-09-06T08:40:00": 16.0,
"2018-09-06T08:50:00": 20.0,
"2018-09-06T09:00:00": 4.0,
"2018-09-06T09:10:00": 6.0,
"2018-09-06T09:20:00": 4.0,
"2018-09-06T09:30:00": 10.0,
"2018-09-06T09:40:00": 4.0,
"2018-09-06T09:50:00": 8.0,
"2018-09-06T10:00:00": 2.0,
"2018-09-06T10:10:00": 6.0,
"2018-09-06T10:20:00": 30.0,
"2018-09-06T10:30:00": 56.0,
"2018-09-06T10:40:00": 42.0,
"2018-09-06T10:50:00": 50.0,
"2018-09-06T11:00:00": 30.0,
"2018-09-06T11:10:00": 16.0,
"2018-09-06T11:20:00": 14.0,
"2018-09-06T11:30:00": 8.0,
"2018-09-06T11:40:00": 32.0,
"2018-09-06T11:50:00": 64.0,
"2018-09-06T12:00:00": 50.0,
"2018-09-06T12:10:00": 30.0,
"2018-09-06T12:20:00": 34.0,
"2018-09-06T12:30:00": 18.0,
"2018-09-06T12:40:00": 18.0,
"2018-09-06T12:50:00": 26.0,
"2018-09-06T13:00:00": 76.0,
"2018-09-06T13:10:00": 76.0,
"2018-09-06T13:20:00": 62.0,
"2018-09-06T13:30:00": 52.0,
"2018-09-06T13:40:00": 62.0,
"2018-09-06T13:50:00": 54.0,
"2018-09-06T14:00:00": 56.0,
"2018-09-06T14:10:00": 76.0,
"2018-09-06T14:20:00": 68.0,
"2018-09-06T14:30:00": 74.0,
"2018-09-06T14:40:00": 88.0,
"2018-09-06T14:50:00": 86.0,
"2018-09-06T15:00:00": 116.0,
"2018-09-06T15:10:00": 100.0,
"2018-09-06T15:20:00": 126.0,
"2018-09-06T15:30:00": 92.0,
"2018-09-06T15:40:00": 96.0,
"2018-09-06T15:50:00": 120.0,
"2018-09-06T16:00:00": 104.0,
"2018-09-06T16:10:00": 90.0,
"2018-09-06T16:20:00": 120.0,
"2018-09-06T16:30:00": 78.0,
"2018-09-06T16:40:00": 98.0,
"2018-09-06T16:50:00": 118.0,
"2018-09-06T17:00:00": 108.0,
"2018-09-06T17:10:00": 168.0,
"2018-09-06T17:20:00": 120.0,
"2018-09-06T17:30:00": 148.0,
"2018-09-06T17:40:00": 100.0,
"2018-09-06T17:50:00": 80.0,
"2018-09-06T18:00:00": 92.0,
"2018-09-06T18:10:00": 78.0,
"2018-09-06T18:20:00": 58.0,
"2018-09-06T18:30:00": 76.0,
"2018-09-06T18:40:00": 118.0,
"2018-09-06T18:50:00": 70.0,
"2018-09-06T19:00:00": 70.0,
"2018-09-06T19:10:00": 42.0,
"2018-09-06T19:20:00": 28.0,
"2018-09-06T19:30:00": 32.0,
"2018-09-06T19:40:00": 36.0,
"2018-09-06T19:50:00": 26.0,
"2018-09-06T20:00:00": 14.0,
"2018-09-06T20:10:00": 16.0,
"2018-09-06T20:20:00": 20.0,
"2018-09-06T20:30:00": 24.0,
"2018-09-06T20:40:00": 20.0,
"2018-09-06T20:50:00": 18.0,
"2018-09-06T21:00:00": 22.0,
"2018-09-06T21:10:00": 26.0,
"2018-09-06T21:20:00": 24.0,
"2018-09-06T21:30:00": 14.0,
"2018-09-06T21:40:00": 14.0,
"2018-09-06T21:50:00": 12.0,
"2018-09-06T22:00:00": 8.0,
"2018-09-06T22:10:00": 8.0,
"2018-09-06T22:20:00": 2.0,
"2018-09-06T22:30:00": 2.0,
"2018-09-06T22:40:00": 2.0,
"2018-09-06T22:50:00": 4.0,
"2018-09-06T23:00:00": 8.0,
"2018-09-06T23:10:00": 4.0,
"2018-09-06T23:20:00": 6.0,
"2018-09-06T23:30:00": 2.0,
"2018-09-06T23:40:00": 2.0
}
}
]
I don't believe there is a way of doing this with Talend components. I *think* there was a Jira ticket set up for this. However, you can do it with a bit of Java and a tJavaFlex/tJavaRow. Take a look at the solution to this question; you will need to know Java for this, and if you do you should be able to extrapolate from it:
https://stackoverflow.com/questions/7304002/how-to-parse-a-dynamic-json-key-in-a-nested-json-result
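For illustration, here is a minimal standalone sketch of that dynamic-key idea, assuming the org.json library is on the classpath (the sample string and variable names are just for demonstration):

// The keys of "data" are not known in advance, so iterate over keys() at runtime
String jsonString = "{\"name\":\"location\",\"id\":20,\"data\":{\"2018-09-06T00:00:00\":36.0}}";
org.json.JSONObject data = new org.json.JSONObject(jsonString).getJSONObject("data");
java.util.Iterator<?> keys = data.keys();
while (keys.hasNext()) {
    String key = (String) keys.next();      // dynamic key, e.g. "2018-09-06T00:00:00"
    double reading = data.getDouble(key);   // its corresponding value
    System.out.println(key + " -> " + reading);
}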
I was able to process the JSON and get the key/value pairs using the code below (please also see the attached screenshot of the current job). I now want to take the key/value pairs produced in the tJavaRow component and pass them to the next component, say a tSortRow, where I can sort the output by date in ascending order and then store the datetime and value in a CSV file.
How can I pass the key/value pairs as individual columns to the next component?
// Imports needed (tJavaRow Advanced settings > Import):
// import org.json.JSONObject;
// import java.util.Date;
// import java.util.Iterator;
// import java.text.DateFormat;
// import java.text.SimpleDateFormat;

// Strip the enclosing [ and ] so the payload parses as a single JSON object
String body = row1.Body.substring(1, row1.Body.length() - 1);
System.out.println(body);
JSONObject jsonObject = new JSONObject(body);
JSONObject value = jsonObject.getJSONObject("data");
Iterator<?> keys = value.keys();
Date date = null;
// Use HH (24-hour clock) rather than hh, since the keys run from hour 00 to 23
DateFormat format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
SimpleDateFormat sdfDestination = new SimpleDateFormat("dd-MMM-yyyy HH:mm:ss");
while (keys.hasNext()) {
    String key = (String) keys.next();
    date = format.parse(key);
    String destDate = sdfDestination.format(date);
    double value_1 = value.getDouble(key);
    System.out.println(destDate + "," + value_1);
}
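As an aside, if the only sorting needed is ascending datetime order, the keys can also be sorted inside the same Java code before being emitted. Here is a minimal sketch reusing the org.json value object from the code above; a TreeMap is enough because these fixed-width ISO-8601 strings sort chronologically when sorted as plain strings:

// TreeMap keeps its keys in sorted (here: chronological) order
java.util.TreeMap<String, Double> sorted = new java.util.TreeMap<String, Double>();
java.util.Iterator<?> ks = value.keys();
while (ks.hasNext()) {
    String k = (String) ks.next();
    sorted.put(k, value.getDouble(k));
}
for (java.util.Map.Entry<String, Double> e : sorted.entrySet()) {
    System.out.println(e.getKey() + "," + e.getValue());
}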
Here is how I would do this.

1) Store your HashMap in the globalMap:

globalMap.put("hashmap", yourHashMap);

2) Start a new subjob with a tJavaFlex. In its Start Code section, get the key set of your HashMap and start an iteration over it. For example:

java.util.HashMap<String, String> yourHashMap = ((java.util.HashMap<String, String>) globalMap.get("hashmap"));
java.util.Iterator<String> keysIt = yourHashMap.keySet().iterator();
while (keysIt.hasNext()) {

3) In the Main Code section, retrieve your records and output them to the columns you should have configured manually in the schema. One row is produced per loop of this section, which is controlled by the while loop opened in the Start Code:

row1.myColumn = yourHashMap.get(keysIt.next());

4) Then, in the End Code section, simply close the while loop and do any other tidying up you may wish to do:

}
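Putting the three sections together, a minimal sketch might look like the following. It assumes the values were stored as Doubles, that the tJavaFlex output flow is called row2, and that its schema has two hypothetical columns, dateKey and reading, which can then feed a tSortRow (sort on dateKey, ascending) and finally a tFileOutputDelimited to produce the CSV:

// --- Start Code ---
// Fetch the map stored earlier and open the loop; one output row per key
java.util.HashMap<String, Double> dataMap =
    (java.util.HashMap<String, Double>) globalMap.get("hashmap");
java.util.Iterator<String> keysIt = dataMap.keySet().iterator();
while (keysIt.hasNext()) {

// --- Main Code ---
// Call next() only once per loop and reuse it, so the key and its value
// both land in the same output row (the column names are assumptions)
    String currentKey = keysIt.next();
    row2.dateKey = currentKey;
    row2.reading = dataMap.get(currentKey);

// --- End Code ---
}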
@rhall: Thank you so much, the solution worked!!