Hi Community,
I am trying to publish a dataset to AWS S3 from Qlik Catalog. It works when the entity has a small number of rows.
When I try to publish an entity with 17M rows and 20 fields, it fails with the error message below:
Task DataShippingHandler failed: ExecException: Stopping execution on job failure with -stop_on_failure option
Has anyone faced this issue with large datasets?
What size of entity can be published in terms of number of rows and fields? Does anyone know the limitations of the Qlik Data Catalog publish feature?
Thanks,
Akshaye
Hello Akshaye,
There are a number of different factors that contribute to the limitations on ingestion of large datasets. Is the QDC product the full Hadoop version or the single-server variety? If you want to ingest/catalog large datasets, the Hadoop version is advisable. That said, for the most part, limits on large file publishes are constrained first by QDC product type and then by infrastructure (disk size, network, etc.). There are no hard caps in the product itself.
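Since disk size is one of the constraints mentioned above, a rough back-of-envelope estimate of the publish size can help check whether the infrastructure is adequate. This is only a sketch: the ~20 bytes-per-field average is a hypothetical assumption, and actual sizes depend on data types, encoding, and file format.

```python
# Rough order-of-magnitude estimate of the on-disk size of a published entity.
# Assumption (hypothetical): an average serialized field width of ~20 bytes.

def estimate_publish_bytes(rows: int, fields: int, avg_field_bytes: int = 20) -> int:
    """Return a rough size estimate in bytes for a flat, uncompressed publish."""
    return rows * fields * avg_field_bytes

# The entity from the question: 17M rows x 20 fields.
size = estimate_publish_bytes(17_000_000, 20)
print(f"~{size / 1024**3:.1f} GiB")  # roughly 6.3 GiB under these assumptions
```

If the staging area or temp directories on the node have less free space than an estimate like this (plus headroom for intermediate files), a large publish can fail even though the product itself imposes no hard cap.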
You might also consider opening a support case, as Support could help you triangulate the cause of this error faster. Hope that helps.
David
We have a single-node architecture with a dedicated web server and database server.