Parikhharshal
Creator III

Shows null values for tExtractJsonFields

Hi there

 

I have designed my job as shown below, but it returns NULL values. I am fairly sure my tExtractJsonFields is not configured properly.

 

0683p000009M134.png

 

I am trying to extract the JSON shown below, which is returned from a URL:

 

[
  {
    "assignment_id": 1234,
    "title": "Assignment 1",
    "points_possible": 10,
    "due_at": "2012-01-25T22:00:00-07:00",
    "unlock_at": "2012-01-20T22:00:00-07:00",
    "muted": false,
    "min_score": 2,
    "max_score": 10,
    "median": 7,
    "first_quartile": 4,
    "third_quartile": 8,
    "module_ids": [
        1,
        2
    ],
    "submission": {
      "submitted_at": "2012-01-22T22:00:00-07:00",
      "score": 10
    }
  },
  {
    "assignment_id": 1235,
    "title": "Assignment 2",
    "points_possible": 15,
    "due_at": "2012-01-26T22:00:00-07:00",
    "unlock_at": null,
    "muted": true,
    "min_score": 8,
    "max_score": 8,
    "median": 8,
    "first_quartile": 8,
    "third_quartile": 8,
    "module_ids": [
        1
    ],
    "submission": {
      "submitted_at": "2012-01-22T22:00:00-07:00"
    }
  }
]

My tExtractJsonFields is configured as below:

0683p000009M139.png
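For reference, here is a rough check of the loop and field paths, run outside Talend with the Jayway json-path library (which I believe is similar to what the component evaluates in JsonPath mode). The class name and the shortened JSON string are only for illustration, not part of my job:

import java.util.List;

import com.jayway.jsonpath.JsonPath;

public class JsonPathCheck {
    public static void main(String[] args) {
        // Shortened version of the JSON returned by the URL
        String json = "[{\"assignment_id\":1234,\"submission\":{\"score\":10}},"
                    + "{\"assignment_id\":1235,\"submission\":{}}]";

        // Loop query: one entry per element of the top-level array
        List<Object> assignments = JsonPath.read(json, "$[*]");

        // Field queries relative to the looped elements; a missing submission.score is simply skipped
        List<Integer> ids = JsonPath.read(json, "$[*].assignment_id");
        List<Integer> scores = JsonPath.read(json, "$[*].submission.score");

        System.out.println(assignments.size() + " assignments, ids=" + ids + ", scores=" + scores);
    }
}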

 

Can someone please tell me what I am doing wrong?

 

Thanks

Harshal.

 

Parikhharshal
Creator III
Author

@rhall: Thanks for your reply. What you have suggested only solves part of my problem; at best it lets me remove one step from the job.

Now that I have the code, my plan is to read the response as a string from tRESTClient, then call a routine/function in tJavaFlex (it processes the whole JSON payload and converts it into CSV), and finally read the CSV file with tFileInputDelimited and load it into the database.

Let me know your thoughts, or whether there is any way I can improve the job by reducing steps.
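Roughly, the routine I have in mind looks like the sketch below. The names Analytics and parseJsonData match what I plan to use, but the body is only an illustration that assumes the Jackson library is available, and it writes just a few of the fields; the real routine will handle all of them:

package routines;

import java.io.FileWriter;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class Analytics {

    // Flattens the JSON array returned by the REST call into a CSV file at csvPath
    public void parseJsonData(String json, String csvPath) throws Exception {
        JsonNode root = new ObjectMapper().readTree(json);
        try (FileWriter out = new FileWriter(csvPath)) {
            for (JsonNode a : root) {
                out.write(a.path("assignment_id").asInt() + ","
                        + a.path("title").asText() + ","
                        // submission.score can be missing, so default it to 0
                        + a.path("submission").path("score").asInt(0) + "\n");
            }
        }
    }
}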
Anonymous
Not applicable

Are you saying that this helped you get hold of the JSON string? I'm not sure I understood what you meant in your post. Do you have a colleague who can help with the Java? I ask because it is very difficult to debug code remotely. It might make more sense to work with a colleague on this to share skills.

Parikhharshal
Creator III
Author

@rhall: Yes, that's correct. I have a colleague who is helping with the Java, as it's beyond my capability.

But setting the Java code aside, does the Talend design make sense for what I'm trying to achieve? Is there any other room for improvement in the job design, or is this the way you would do it?
Anonymous
Not applicable

Get the job working and then come back if you have any questions on potential improvements. At the moment, there is nothing to improve.

Parikhharshal
Creator III
Author

@rhall: This is what my job looks like:

 

0683p000009M1bG.png

 

The tJavaFlex component contains the following code:

 

Analytics analyticsAPI = new Analytics();

// Convert the JSON string from tRESTClient into a CSV at the path stored in tempFileURI
analyticsAPI.parseJsonData(row2.string, (String) globalMap.get("tempFileURI"), student_id, sis_user_id, course_id, course_code);

 

It calls the Analytics routine and writes the output to a .csv file, which I then upload to S3 using PutLadedData. Once this job succeeds, another subjob runs and copies all the files from S3 into Redshift using the steps below:

 

0683p000009M1Fu.png
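The S3-to-Redshift step boils down to a COPY statement. For context, here is a minimal sketch of the equivalent outside Talend; the JDBC URL, credentials, table, bucket, and IAM role are all made-up placeholders, and in the actual job the Talend components run this for me:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class S3ToRedshiftCopy {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; the real ones live in the job's context
        try (Connection conn = DriverManager.getConnection(
                "jdbc:redshift://example.abc123.us-east-1.redshift.amazonaws.com:5439/dev",
                "dbuser", "dbpassword");
             Statement stmt = conn.createStatement()) {

            // One COPY picks up every CSV written under the prefix by the parallel runs
            stmt.execute(
                "COPY analytics.assignments "
              + "FROM 's3://my-bucket/analytics/' "
              + "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy' "
              + "CSV");
        }
    }
}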

 

Is there anything I should change, or does this seem correct?

Anonymous
Not applicable

It looks OK, but if you are running the job that writes the data to a file in parallel, you will have to make sure that it is one file with a unique filename per parallel iteration. You cannot write to the same file in parallel.

Parikhharshal
Creator III
Author

@rhall: Thanks for your reply. Yes, I'm creating a unique file per parallel process by adding the course ID and student ID to the file name.
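Concretely, the file name is built per iteration along these lines (the class, variable names, and path are illustrative, not the exact ones in my job):

public class UniqueFileName {
    public static void main(String[] args) {
        // Illustrative values; in the job these come from the current parallel iteration
        String courseId = "101";
        String studentId = "2001";

        // One distinct file per (course, student) pair, so parallel runs never collide
        String tempFileURI = "/tmp/analytics_" + courseId + "_" + studentId + ".csv";
        System.out.println(tempFileURI);
    }
}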
Anonymous
Not applicable

If it works, then you're good 🙂

Parikhharshal
Creator III
Author

@rhall: Yes, it works fine, and this way I'm able to run 25 parallel iterations, which is quite good. This reduced my job time from an hour and a half to 12 minutes 😀.