karandama2006
Creator

downloading JSON in structured format

Hi All, 

I'm working with web services: I need to download the JSON response using HTTP GET requests and then read those JSON files and parse them.

This works out well for smaller files. The issue here is that the JSON document I get is downloaded as a single row, so I have 200 MB of data on one line.

Such a file is impossible to open in an editor, and if I try to parse it in Talend I get an out of memory exception.

Is there any workaround for this issue?

Does the JSON response from an HTTPS GET request always have to be in a single row? Can we break it down and do some formatting before saving it to disk?

What is the best way to parse a large JSON file bigger than 200 MB?

 

 

1 Solution

Accepted Solutions
Anonymous
Not applicable

I've looked at your file, and the example you gave was not valid JSON. I suspect that is just down to how you transferred it and pasted it here, though. Before this code....

 

{ "ActionId":410768,

....you needed a comma. It's a small thing, but if your source does not provide it, it will not work.
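In other words, the boundary between the two Action objects in the array should look something like this (just a minimal fragment to show where the comma goes, not your real data):

      },
      { "ActionId":410768,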

 

Now, I changed the values in your file to make it easier for me to see whether it was working. The file then looked like this....

{
   "ExportSetGuid":"a7480ae9-5045-4e33-b6f5-875cd17e1711",
   "Actions":[
      {
         "ActionId":410766,
         "Allocations":[
            {
               "AllocationTargetOrgUnitId":82373,
               "FiscalYears":[
                  {
                     "FiscalYear":2017,
                     "PercentValue":1.0300
                  },
                  {
                     "FiscalYear":2018,
                     "PercentValue":2.0300
                  }

               ]
            }
         ]
      },
      { "ActionId":410768, 
         "Allocations":[ 
             { "AllocationTargetOrgUnitId":82377,
               "FiscalYears":
                [ 
                { "FiscalYear":2020, "PercentValue":3.0600 },
                { "FiscalYear":2021, "PercentValue":5.0900 }
                ]
             }
          ]
       }
   ]
}

I then built a job like this.....

0683p000009M2dJ.png

 

Ignore the deactivated tLogRows. I added those to see what I was getting after each component. This is a really good way of checking you are on the right track. The following screenshots show each of the components from left to right in order (ignoring the tLogRows).

0683p000009M2nX.png

First I extract the outermost loop information. The Actions column in the above screenshot is essentially pulling out a JSON snippet from the bigger JSON file and sending it forward to be processed.

0683p000009M2nc.png

The above tExtractJsonFields component is passing through the ExportSetGuid value (hence its JSON query is left blank) and is extracting the ActionId value and the Allocations array JSON snippet. Notice the JSON field value is set to Actions. Also notice the loop; it is simply looping over the array.

0683p000009M2ft.png

The above tExtractJsonFields component is passing through the ExportSetGuid and ActionId values and is extracting the AllocationTargetOrgUnitId value and the FiscalYears array JSON snippet. Notice the JSON field value is set to Allocations. Also notice the loop; it is simply looping over the array.

0683p000009M2Vy.png

The above tExtractJsonFields component is passing through the ExportSetGuid, ActionId and AllocationTargetOrgUnitId values and is extracting the FiscalYear and PercentValue values. This is the lowest leaf on the tree. Notice the JSON field value is set to FiscalYears. Also notice the loop; it is simply looping over the array.
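In case the screenshots are hard to read, the settings behind them are roughly as follows. This is a sketch, so treat the loop query strings as an approximation rather than the exact dialog contents:

   1) First component (reads the file): Loop Jsonpath query "$"
         ExportSetGuid -> "ExportSetGuid"
         Actions       -> "Actions"

   2) tExtractJsonFields (JSON field = Actions, Loop Jsonpath query "$[*]")
         ExportSetGuid -> (blank, passed through)
         ActionId      -> "ActionId"
         Allocations   -> "Allocations"

   3) tExtractJsonFields (JSON field = Allocations, Loop Jsonpath query "$[*]")
         ExportSetGuid, ActionId   -> (blank, passed through)
         AllocationTargetOrgUnitId -> "AllocationTargetOrgUnitId"
         FiscalYears               -> "FiscalYears"

   4) tExtractJsonFields (JSON field = FiscalYears, Loop Jsonpath query "$[*]")
         ExportSetGuid, ActionId, AllocationTargetOrgUnitId -> (blank, passed through)
         FiscalYear   -> "FiscalYear"
         PercentValue -> "PercentValue"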

 

The output from this was.....

Starting job ExampleFilteringtXML at 11:28 26/02/2019.

[statistics] connecting to socket on port 3677
[statistics] connected
a7480ae9-5045-4e33-b6f5-875cd17e1711|410766|82373|2017|1.03
a7480ae9-5045-4e33-b6f5-875cd17e1711|410766|82373|2018|2.03
a7480ae9-5045-4e33-b6f5-875cd17e1711|410768|82377|2020|3.06
a7480ae9-5045-4e33-b6f5-875cd17e1711|410768|82377|2021|5.09
[statistics] disconnected

Job ExampleFilteringtXML ended at 11:28 26/02/2019. [exit code=0]

This is not necessarily a quick and easy way to achieve this, but it is a methodical way to work. 


12 Replies
Anonymous
Not applicable

The problem here is NOT the formatting. If anything, the lack of formatting saves space. The problem is the size of the data. You may want to find a text editor which handles large files better than the one you are using; I use UEStudio, which easily handles 200 MB.

 

To read it in Talend, you may want to play with your job and Studio memory settings. 

This link will help you increase the memory for Studio (https://help.talend.com/reader/pd~gJPOP3y0tu8jedjNfIA/aNyC~EW3eVxmjwAUWgA~pw)

This link will help with increasing the memory for your job (https://community.talend.com/t5/Migration-Configuration-and/OutOfMemory-Exception/ta-p/21669)
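Very roughly, what you end up setting for the job (Run view > Advanced settings > Use specific JVM arguments; the exact values depend on your machine) is something along the lines of:

-Xms1024M
-Xmx4096M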

 

vapukov
Master II

+1 to all from @rhall 

 

It's also good to check your API endpoint documentation; many APIs provide additional settings to reduce the response size by splitting it into chunks.

For example (a rough download loop is sketched after this list):

  • receive all customers: one very large JSON
  • receive 100 customers at offset XXX: still all the customers, but 100 at a time over many iterations
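As a very rough illustration only (the endpoint, parameter names and stop condition here are all made up; check your API's documentation for the real ones), a paged download loop could look something like this:

import java.io.BufferedReader;
import java.io.FileWriter;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class PagedDownload {
    public static void main(String[] args) throws Exception {
        int pageSize = 100;  // hypothetical "limit" supported by the API
        int offset = 0;      // hypothetical "offset" parameter
        try (FileWriter out = new FileWriter("customers_pages.txt")) {
            while (true) {
                // Hypothetical endpoint and parameter names - replace with your API's own.
                URL url = new URL("https://example.com/api/customers?limit=" + pageSize + "&offset=" + offset);
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("GET");

                StringBuilder page = new StringBuilder();
                try (BufferedReader in = new BufferedReader(
                        new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        page.append(line);
                    }
                }

                // Stop when the API returns an empty result set (assumed here to be an empty JSON array).
                if (page.toString().trim().equals("[]")) {
                    break;
                }

                // One page per line, so no single row ever grows to 200 MB.
                out.write(page.toString());
                out.write(System.lineSeparator());
                offset += pageSize;
            }
        }
    }
}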

 

An additional +1 for UltraEdit: it can edit documents of practically any size; personally, I use it for 15-20 GB files.

karandama2006
Creator
Author

Thanks @rhall and @vapukov for your valuable suggestions.

I tried increasing the memory of Talend Studio and the Job with the configuration below:

 

-vm
C:\Program Files (x86)\Talend-Studio\jre1.8.0_131\bin\server\jvm.dll
-vmargs
-Xms1024m
-Xmx40966m
-Dfile.encoding=UTF-8
-Dosgi.requiredJavaVersion=1.8
-XX:+UseG1GC
-XX:+UseStringDeduplication

0683p000009M2f5.png

 

I set a temporary data directory in the tMap as well.

 

0683p000009M2lv.png

 

But this did not help; instead of an out of memory exception, Talend has now been stuck for 3 hours.
I'm working on a system which has 8 GB of RAM.

 

My Job works for other files which are 10 MB or smaller and share a similar JSON structure with the 208 MB file.

 

0683p000009M2m0.png

 

The tMap is not complicated; it is a one-to-one mapping with the addition of a timestamp.

 

Is there any other way we can get this to work?

 

Thanks

Anonymous
Not applicable

I think I see your problem. You are using XPath instead of JsonPath. What happens when you do this is that the JSON is converted to XML in memory and then the XML is parsed. With a 200 MB JSON file, this is a massive overhead. If you are working with JSON, you should try to use JsonPath.
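If it helps to see the difference outside of Talend, here is a tiny standalone sketch using the Jayway json-path library (which is, as far as I know, what the JsonPath option is built on; you would need the json-path jar on the classpath to run it):

import com.jayway.jsonpath.JsonPath;
import java.util.List;

public class JsonPathDemo {
    public static void main(String[] args) {
        // A cut-down version of the JSON from this thread.
        String json = "{ \"ExportSetGuid\":\"a7480ae9-5045-4e33-b6f5-875cd17e1711\","
                + " \"Actions\":[ { \"ActionId\":410766 }, { \"ActionId\":410768 } ] }";

        // JsonPath queries the JSON directly - nothing is converted to XML in memory,
        // which is where the XPath route costs so much on a 200 MB file.
        String guid = JsonPath.read(json, "$.ExportSetGuid");
        List<Integer> actionIds = JsonPath.read(json, "$.Actions[*].ActionId");

        System.out.println(guid);       // a7480ae9-5045-4e33-b6f5-875cd17e1711
        System.out.println(actionIds);  // [410766, 410768]
    }
}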

karandama2006
Creator
Author

@rhall  Thanks for your reply

 

I had tried the other options before, but I did not get the expected output, so I switched to XPath.
I tried to parse the JSON again with the configuration below on a smaller file; this config is similar to the one I used with XPath.

 

0683p000009M2mA.jpg

 

But I get an error and unexpected output.

 

The Json resource datas maybe have some problems, please make sure the data structure with the same fields.
[WARN ]: integration_demo.j_08_transfer_transactional_tr_alloc_fy_0_1.j_08_transfer_transactional_tr_alloc_fy - tFileInputJSON_2 - The Json resource datas maybe have some problems, please make sure the data structure with the same fields.

The output file does not repeat the outermost element of the JSON (the export ID in this case) for every row.
In the screenshot below we can see it prints the export ID only once, but I need it on all lines.

 

0683p000009M2g8.jpg

 

This is a small JSON snippet:

 

{
   "ExportSetGuid":"a7480ae9-5045-4e33-b6f5-875cd17e1711",
   "Actions":[
      {
         "ActionId":410766,
         "Allocations":[
            {
               "AllocationTargetOrgUnitId":82373,
               "FiscalYears":[
                  {
                     "FiscalYear":2019,
                     "PercentValue":3.0200
                  }
               ]
            }
         ]
      }
   ]
}

 

Anonymous
Not applicable

Why are you using "JsonPath without loop"? Surely you need the loops?

karandama2006
Creator
Author

JsonPath with loops only returns the data which is inside a list (this is what I've found).

 

I need data from the list as well as the elements which are outside it.

Anonymous
Not applicable

Are you saying that your 200 MB JSON file does not have any loops in it? If you are extracting looped data, you will need to use "JsonPath" and not "JsonPath without loop".
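Roughly speaking (this is an approximation of the dialog, from memory): with "JsonPath without loop" every column gets a whole-document query, whereas with "JsonPath" you set a loop query and each column query is relative to the current loop element, giving one row per element. Something like:

   JsonPath without loop:
      actionId -> "$.Actions[*].ActionId"     (each column queries the whole document)

   JsonPath (with a loop):
      Loop Jsonpath query: "$.Actions[*]"
      actionId -> "ActionId"                  (relative to the loop, one row per Action)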

karandama2006
Creator
Author

All my files have loops, but if I use JsonPath and then specify a loop, I am only able to retrieve data from a single loop.

In my JSON there are many loops, and I want to be able to retrieve data from all the loops at once, in a single line (XPath allows me to do that, as we can see in the earlier screenshot).

 

If the JSON below is the input

 

{
   "ExportSetGuid":"a7480ae9-5045-4e33-b6f5-875cd17e1711",
   "Actions":[
      {
         "ActionId":410766,
         "Allocations":[
            {
               "AllocationTargetOrgUnitId":82373,
               "FiscalYears":[
                  {
                     "FiscalYear":2019,
                     "PercentValue":3.0200
                  },
                  {
                     "FiscalYear":2019,
                     "PercentValue":5.0200
                  }

               ]
            }
         ]
      }
{ "ActionId":410768,
"Allocations":[
{ "AllocationTargetOrgUnitId":82373,
"FiscalYears":
[
{ "FiscalYear":2019, "PercentValue":3.0200 },
{ "FiscalYear":2019, "PercentValue":5.0200 }
]
}
]
} ] }

I need the output below:

 

a7480ae9-5045-4e33-b6f5-875cd17e1711;410766;82373;2019;3.0200
a7480ae9-5045-4e33-b6f5-875cd17e1711;410766;82373;2019;5.0200

a7480ae9-5045-4e33-b6f5-875cd17e1711;410768;82373;2019;3.0200
a7480ae9-5045-4e33-b6f5-875cd17e1711;410768;82373;2019;5.0200

 

I have tried many times, but I am unable to achieve the above result with "JsonPath" or "JsonPath without loop"; it works only with XPath.