GarethPenrose
Contributor

Talend job hanging at "starting"

I have a Redshift output component pointing at a table with 37 million rows in it, fed by a tMap and several lookups. When I start the job it doesn't even try to run the input query or the tMap; it just loads the lookup tables and then says "starting" above the flow into the tRedshiftOutput component.

 

This issue has been occurring all over the place across all my jobs; they just refuse to run.

 

Anyone ever heard of this or know what my problem might be?

 

Image is "Live Statistics" from TAC.

 


1 Solution

Accepted Solutions
Anonymous
Not applicable

Hi,

It seems the DB you were connecting to was unresponsive. Did you try restarting your DB to see if it works?

Would it be OK for you to output your data into a file instead of the DB?

Best regards

Sabrina


11 Replies
GarethPenrose
Contributor
Author

And here is an image showing the current status:

 

[Screenshot: current job status]

Anonymous
Not applicable

Maybe tick the "Die on error" checkbox on the tRedshiftOutput component and run the job again to see what is causing it to hang.

GarethPenrose
Contributor
Author

Thank you for the suggestion. I gave it a go but no luck; there doesn't seem to be an error, because the component does not die and I do not get a message in the log. It still hangs.
GarethPenrose
Contributor
Author

I put a System.out.println in the input query box and it won't even run that. No idea what it is doing.
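(A small aside in case it helps someone reading later: one variant of that trick, sketched below as a hypothetical user routine rather than anything built into Talend, is to wrap the query in a helper that logs it and returns it. The Query field still evaluates to a String, and the print only fires if the component is actually reached.)

// Hypothetical Talend user routine (created under Code > Routines); the name is
// a placeholder. Use it in a Query field as MyDebug.logAndReturn("SELECT ...").
public class MyDebug {

    public static String logAndReturn(String query) {
        // Only printed when the input component actually evaluates its query.
        System.out.println("About to run query: " + query);
        return query;
    }
}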

Anonymous
Not applicable

Are you sure you have access to the Redshift table from wherever you are running the job?
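For anyone hitting the same symptom, a quick way to rule that out is a bare JDBC connection test run from the same machine that actually executes the job. This is only a sketch: the cluster endpoint, database, and credentials are placeholders, and it assumes the Redshift JDBC driver is on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RedshiftConnectionCheck {

    public static void main(String[] args) throws Exception {
        // Fail fast instead of hanging if the cluster is unreachable.
        DriverManager.setLoginTimeout(10);

        String url = "jdbc:redshift://cluster.example.redshift.amazonaws.com:5439/analytics";
        try (Connection conn = DriverManager.getConnection(url, "user", "secret");
             Statement stmt = conn.createStatement();
             // SELECT 1 is enough to prove the job machine can reach the cluster.
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("Connected; SELECT 1 returned " + rs.getInt(1));
        }
    }
}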

Anonymous
Not applicable

Hi,

It seems the DB you were connecting to was unresponsive. Did you try restarting your DB to see if it works?

Would it be OK for you to output your data into a file instead of the DB?

Best regards

Sabrina

GarethPenrose
Contributor
Author

Writing to a file works.

 

It seems to be a problem with the database.

 

Thank you for the help.

Anonymous
Not applicable

I have exactly the same issue. Was this ever resolved?

GarethPenrose
Contributor
Author

This was almost a year ago now!

 

I can't really say exactly what it was; all I've got is a few key things we've done since:

 

- Use tRedshiftOutputBulkExec instead of a standard tRedshiftOutput. This writes files to S3 and then uses the COPY command to load the data into Redshift (there's a rough sketch of the pattern after this list).

- Make sure to enable streaming in the tMysqlInput advanced settings (for large result sets).

- We moved away from tMap in favour of tJavaFlex and doing all the mapping programmatically where feasible.
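For anyone who finds this later, here is a rough end-to-end sketch of what the first two bullets amount to outside of Talend. It is only illustrative: every hostname, table, bucket, role ARN and credential below is a placeholder, and it assumes the MySQL Connector/J and Redshift JDBC drivers are on the classpath.

import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class BulkLoadSketch {

    public static void main(String[] args) throws Exception {
        // 1) Streaming read from MySQL: with Connector/J, a forward-only,
        //    read-only statement plus setFetchSize(Integer.MIN_VALUE) streams
        //    rows instead of buffering the whole result set in memory.
        try (Connection mysql = DriverManager.getConnection(
                     "jdbc:mysql://mysql-host:3306/sourcedb", "user", "secret");
             Statement read = mysql.createStatement(
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
            read.setFetchSize(Integer.MIN_VALUE);
            try (ResultSet rs = read.executeQuery("SELECT id, name FROM big_table");
                 PrintWriter out = new PrintWriter("/tmp/big_table.csv")) {
                while (rs.next()) {
                    out.println(rs.getLong("id") + "," + rs.getString("name"));
                }
            }
        }

        // 2) Upload /tmp/big_table.csv to S3 (AWS SDK or CLI) - omitted here.

        // 3) Load into Redshift with a single COPY from S3, which is the same
        //    idea tRedshiftOutputBulkExec uses instead of row-by-row INSERTs.
        try (Connection redshift = DriverManager.getConnection(
                     "jdbc:redshift://cluster.example.redshift.amazonaws.com:5439/analytics",
                     "user", "secret");
             Statement load = redshift.createStatement()) {
            load.execute("COPY big_table FROM 's3://my-bucket/staging/big_table.csv' "
                       + "IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole' "
                       + "DELIMITER ','");
        }
    }
}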

 

Let me know if you are still stuck after trying some of this stuff and I'll see if I can think of anything else.