Manigandan_Janarthanan

Need to write 30 sheets in Excel

Hi All,

I'm pulling one table from a SQL Server database.

Steps to be followed:

  1. Using WHERE conditions, I need to segregate the data into 30 sheets in an Excel file.
  2. Note: each sheet holds nearly 20k records.

Steps taken:

  1. Used the tFileExcelSheetOutput component.
  2. Used the Talend tPrejob and tPostjob components.
  3. Created subjobs to work around the problem, but it did not help.

I'm not able to write all the Excel sheets.

The complexity issues are:

  1. Memory management.
  2. Hang state (i.e. the job does not complete).
  3. Even after more than 20 minutes, the job still does not complete.

7 Replies
Manigandan_Janarthanan
Author

Please, can someone suggest a better way to overcome this issue? Eagerly looking forward to any response.

Anonymous
Not applicable

@manigandan J, can you share a screenshot of your job? I want to see how you write each part of the data into a different sheet.

Regards

Shong

Anonymous
Not applicable

If you process the data iteratively, check the 'enable parallel execution' box (see below). This option will improve performance for large volumes of data.

 

0695b00000WubWsAAJ.png 
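The parallel-execution idea above can be sketched outside Talend as well. The stdlib-only illustration below (all names hypothetical, the extraction step is a placeholder) runs one task per sheet partition concurrently, but collects the results in sheet order, since the final write to a single .xlsx file must stay serialized:

```java
import java.util.*;
import java.util.concurrent.*;

public class ParallelPartitions {
    /** Extracts each sheet partition concurrently, then returns the results
     *  in sheet order so the single-file Excel write can stay serialized. */
    static List<String> extractAll(List<String> sheetKeys) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String key : sheetKeys) {
                // Placeholder for the real per-partition extract/transform work.
                futures.add(pool.submit(() -> "rows-for-" + key));
            }
            List<String> ordered = new ArrayList<>();
            for (Future<String> f : futures) {
                ordered.add(f.get()); // blocks until that partition is done
            }
            return ordered;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-ins for the 30 WHERE-condition partitions.
        System.out.println(extractAll(List.of("Q1", "Q2", "Q3", "Q4")));
    }
}
```

The key design point is that parallelism is safe for extraction and transformation, but concurrent writes into one Excel file are not; the serialized collection loop keeps the output deterministic.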

 

 

Manigandan_Janarthanan
Author

I simply read the same table from the database and write it into one Excel file that contains several sheets.

0695b00000WubZ8AAJ.png 0695b00000WubZDAAZ.png

This is one way I have tried.

 

 

Another way: I dropped all the data into tBufferOutput and, using tBufferInput, read the data again and again.

 

0695b00000WubakAAB.png 0695b00000WubbEAAR.png

 

 

Note: I'm writing to the same Excel file across different sheets (i.e. 30 sheets) as per business requirements.

 

 

How do I control memory utilization? The data grows each quarter and needs to be processed periodically to provide the reports to the business and to validate (i.e. cross-check) the data.
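One pattern that could keep memory bounded is to run one WHERE-filtered query per sheet and let each partition become garbage-collectable before the next is loaded, rather than buffering the whole table with tBufferOutput. (In the actual job, the writer side would also need a streaming Excel writer such as Apache POI's SXSSFWorkbook, which flushes rows to temp files instead of holding every sheet in heap.) A stdlib-only sketch of the one-partition-at-a-time idea, with all names hypothetical:

```java
import java.util.*;
import java.util.function.*;

public class PerSheetPasses {
    /**
     * Runs one query per sheet key (e.g. SELECT ... WHERE region = key) and
     * hands the partition to the writer immediately, so at most one sheet's
     * ~20k rows are resident in memory at any time.
     */
    static int writeAllSheets(List<String> sheetKeys,
                              Function<String, List<String>> queryByKey,
                              BiConsumer<String, List<String>> sheetWriter) {
        int rowsWritten = 0;
        for (String key : sheetKeys) {
            List<String> partition = queryByKey.apply(key); // one sheet's rows only
            sheetWriter.accept(key, partition);             // write, then move on
            rowsWritten += partition.size();
            // partition is unreachable after this iteration and can be GC'd
        }
        return rowsWritten;
    }

    public static void main(String[] args) {
        int total = writeAllSheets(List.of("SheetA", "SheetB"),
                key -> List.of(key + "-row1", key + "-row2"),            // stand-in for the SQL query
                (key, rows) -> System.out.println(key + ": " + rows.size() + " rows"));
        System.out.println("total rows: " + total);
    }
}
```

The trade-off is 30 round trips to the database instead of one full scan, but peak heap usage stays near one sheet's worth of rows regardless of how much the table grows each quarter.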

 

Could you please suggest some ideas? It would be a great help to me.

Manigandan_Janarthanan
Author

Yes, I agree, but each sheet has different data. I think we can't use tFlowToIterate here.

 

Each sheet has different columns, and some metadata is also added in tMap using expressions to populate it.

 

I need a better way to handle 30 sheets in one Excel file, especially since the data grows each quarter.

 

For this quarter I'm comparing about 1 million records, and the table has 67 columns (i.e. the data is huge).

 

It may increase again next quarter.

Manigandan_Janarthanan
Author

I'm always getting the error below:

 

INFO: Extending cell ranges for conditional formats. Use formats from row: 2

INFO: Extending cell range for tables...

INFO: Create new extended DataValidations (last written row: 3943), number of validations: 0

#

# There is insufficient memory for the Java Runtime Environment to continue.

# Native memory allocation (malloc) failed to allocate 1048592 bytes for Chunk::new

# An error report file with more information is saved as:

# C:\Users\Manigandan.J\workspace\.repositories\1866045447\master\D365FINOPS\poms\jobs\process\Reconciliation\Product_Reconjobs\match_products_recon_optimization_0.1\src\main\ext-resources\hs_err_pid36152.log

INFO: Extending cell ranges for conditional formats. Use formats from row: 2

INFO: Extending cell range for tables...

INFO: Create new extended DataValidations (last written row: 9694), number of validations: 0

#

# Compiler replay data is saved as:

# C:\Users\Manigandan.J\workspace\.repositories\1866045447\master\D365FINOPS\poms\jobs\process\Reconciliation\Product_Reconjobs\match_products_recon_optimization_0.1\src\main\ext-resources\replay_pid36152.log

 

 

0695b00000WubiKAAR.png 

 

Anonymous
Not applicable

@Maxime GRAZZINI, sorry for the delay in replying; I was on vacation. From the error, there is not enough memory available for the job to continue execution. Try the following and see if the same error still occurs.

1. Make sure you have enough available memory for processing a large volume of data.

2. Split the job into several child jobs, each writing some of the sheets; use tRunJob to call the child jobs and check the 'Use an independent process to run subjob' box.

3. Use specific JVM arguments, as you did, in each child job.
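For suggestion 3, the JVM arguments go in the Run view under Advanced settings > Use specific JVM arguments. The values below are only illustrative; the heap ceiling (-Xmx) should be sized to the RAM actually available on the machine, and note that the malloc failure in the log above is a native-memory exhaustion, so setting -Xmx too high can also starve the JVM's native allocations:

```shell
# Illustrative Talend JVM arguments (tune to the machine):
-Xms1024m
-Xmx4096m
```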

 

Regards

Shong