Yes, it absolutely can (having just done it), but depending on your data structure it might be best not to, as it can still involve a lot of editing.
If your JSON data is in a file you should be able to load it like any other flat table (CSV, etc.), but you might have to split it out into sub-tables by hand.
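The "split it into sub-tables" step can be scripted rather than done by hand. Here's a minimal sketch in Python (the file names, field names, and sample data are all made up for illustration): it flattens nested JSON records into a parent CSV and a child CSV sharing a key, which Qlik can then load like any other flat files.

```python
import csv
import json

# Hypothetical input: each record has a nested "items" list. We split it into
# two flat tables (orders.csv, items.csv) linked on order_id.
data = json.loads("""
[
  {"order_id": 1, "customer": "Ann", "items": [{"sku": "A1", "qty": 2}]},
  {"order_id": 2, "customer": "Bob", "items": [{"sku": "B2", "qty": 1},
                                               {"sku": "C3", "qty": 5}]}
]
""")

# Parent table: one row per order, nested list dropped.
with open("orders.csv", "w", newline="") as f:
    w = csv.DictWriter(f, fieldnames=["order_id", "customer"])
    w.writeheader()
    for rec in data:
        w.writerow({"order_id": rec["order_id"], "customer": rec["customer"]})

# Child table: one row per nested item, carrying the parent key.
with open("items.csv", "w", newline="") as f:
    w = csv.DictWriter(f, fieldnames=["order_id", "sku", "qty"])
    w.writeheader()
    for rec in data:
        for item in rec["items"]:
            w.writerow({"order_id": rec["order_id"], **item})
```

Qlik will then associate the two tables automatically via the shared order_id field name.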
Worst case is probably having to load the JSON into something like MongoDB and then load it into Qlik from there (that's where I started!).
I think that will only connect to web sources (http/https), and the poster is asking about reading from a file.
(I may also have misled the original poster - I was loading from a QVD, so the JSON transformation had already been done by the REST connector.)
Have a look here, which seems like it might help - loop through each row in your file and import the fields one at a time (this probably only works if the JSON is un-nested).
Alternatively, just rename the file to *.csv, point the file connector at it, and see what happens.
And another option is to import into Excel, save the table from there, and reload it into Qlik. This will work as a one-off, but I'm not so sure it's a long-term solution.
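If you want something more repeatable than the Excel round-trip, the same conversion can be done in a small script that runs before each reload. A sketch in Python (file name and sample records are made up for illustration):

```python
import csv
import json

# Convert a flat JSON array of records to CSV. Re-runnable before every
# Qlik reload, unlike a manual Excel export.
records = json.loads('[{"name": "Ann", "score": 70}, {"name": "Bob", "score": 85}]')

with open("scores.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "score"])
    writer.writeheader()
    writer.writerows(records)
```

Note this only works if every record is flat; nested objects or lists still need splitting into sub-tables first.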
(My preferred option would be something like MongoDB though - all the work is done automagically then).
You can read json files directly from a file directory.
In QlikView this option was available from the table load wizard; in Sense it disappeared.
The script still works though ;-)
Here's an example:

LOAD *
from [lib://YSIS/barthelscore.json] (json, codepage is 1252, embedded labels);

- Create a library/connection to your directory first
- If you don't know the column names, first perform a LOAD *, then replace the * with the fields you want