FSternberg
Contributor III

Write error logs to file - after unarchive and load to database

Hello,

Could you help me with a simple solution?

I need to include a log in two steps: while unzipping some files in a directory, and while importing the CSV files into a database.

My project basically unzips CSV files and imports them into a database, but I would like to have logs so I can identify the files that could not be unzipped or imported.

Thanks a lot
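For readers outside Talend, the two steps above can be sketched as standalone Python. Every path, file name, and table name here is illustrative, and sqlite3 stands in for the real target database; the point is that failures in either step are appended to a plain-text error log so the problem files can be identified afterwards.

```python
# Illustrative sketch: unzip every .zip in a directory, import each
# extracted .csv into a database, and append every failure to an
# error log file. All names and paths here are made up.
import csv
import os
import sqlite3
import zipfile

def process(zip_dir, extract_dir, db_path, log_path):
    os.makedirs(extract_dir, exist_ok=True)
    conn = sqlite3.connect(db_path)
    with open(log_path, "a", encoding="utf-8") as log:
        # Step 1: unzip, logging archives that cannot be extracted.
        for name in sorted(os.listdir(zip_dir)):
            if not name.endswith(".zip"):
                continue
            try:
                with zipfile.ZipFile(os.path.join(zip_dir, name)) as zf:
                    zf.extractall(extract_dir)
            except zipfile.BadZipFile as exc:
                log.write(f"UNZIP FAILED: {name}: {exc}\n")
        # Step 2: import, logging CSV files that cannot be loaded.
        for name in sorted(os.listdir(extract_dir)):
            if not name.endswith(".csv"):
                continue
            try:
                with open(os.path.join(extract_dir, name),
                          newline="", encoding="utf-8") as f:
                    rows = list(csv.reader(f))
                header, data = rows[0], rows[1:]
                table = os.path.splitext(name)[0]
                cols = ", ".join(f'"{c}"' for c in header)
                marks = ", ".join("?" for _ in header)
                conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
                conn.executemany(
                    f'INSERT INTO "{table}" VALUES ({marks})', data)
                conn.commit()
            except Exception as exc:
                log.write(f"IMPORT FAILED: {name}: {exc}\n")
    conn.close()
```

In Talend the same effect comes from tLogCatcher (or project-level logging) feeding a tFileOutputDelimited; the sketch just makes the two failure points explicit.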

6 Replies
Anonymous
Not applicable

Hi

Take a look at the tLogCatcher component; it captures errors during job execution. In your case, I think you also need tFileList to iterate over each file. Please refer to the component documentation to learn about these components.

Let me know if you have any other questions.

 

Regards

Shong

FSternberg
Contributor III
Author

Hello, thanks for the reply.

I am using tFileList. Will tLogCatcher alone be enough to save the logs of the process to a txt file?

One other question: if I enable the logs option at the project level, do I still need to use tLogCatcher?

Could you give me a screenshot with an example? My project unzips some files, reads the CSV files, maps the schema, and loads into SQL Server with the same schema.

Anonymous
Not applicable

If you enable the logs option at the project level, you don't need to use the tLogCatcher component in the job. A simple job looks like:

tFileList--iterate--tFileUnarchive--oncomponentok--tFileInputDelimited--main--tMap--out1--tMSSQLOutput

 

Hope it helps!

 

Regards

Shong

FSternberg
Contributor III
Author

 

Please check whether this is correct... I think it will be easier for you to understand.

 

STEP 1 (UNZIP ALL FILES)

 

tFileList to tFileUnarchive >> Iterate connection

 

STEP 2 (LOAD ALL CSV FILES TO DB)

 

PARENT JOB

 

Contexts:  

filename, type String

tablename, type String

directory, type Directory

 

tFileList (specify the directory in the Value field of the context variable directory)

tIterateToFlow (add a column named filename and set the Value field of the Mapping table to CURRENT_FILE, the global variable generated by tFileList)

tFixedFlowInput (add two columns, file_name and table_name, and configure the file-to-table mappings)

 

tMap

1- Drag the filename column of table row1 (from tIterateToFlow) and drop it onto the file_name column of table row2 (from tFixedFlowInput) to join the two tables for the file-name lookup.

2- Drag the filename column of table row1 and drop it onto the filename column of table out.

3- Drag the table_name column of table row2 and drop it onto the tablename column of table out.

4- Set Match Model to Unique match and Join Model to Inner Join.
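The tMap configuration in steps 1-4 behaves like a dictionary lookup: row2 is the unique-match lookup table keyed on file_name, and the Inner Join drops any row1 filename that has no mapping. A minimal Python sketch of that behaviour (the mapping values below are made up):

```python
# Sketch of the tMap inner join above: row2 (the tFixedFlowInput
# mapping table) acts as a unique-match lookup keyed on file_name;
# row1 filenames without a match are dropped (Inner Join).
def join_file_to_table(row1_filenames, row2_mappings):
    lookup = dict(row2_mappings)  # file_name -> table_name, unique match
    out = []
    for filename in row1_filenames:
        if filename in lookup:    # inner join: keep matched rows only
            out.append({"filename": filename,
                        "tablename": lookup[filename]})
    return out
```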

 

tJavaRow (code: context.tablename = out.tablename; and context.filename = out.filename;)

 

tRunJob (select the child Job you want to call from the Repository)

 

CHILD JOB

 

Contexts:

 

filename, type String

tablename, type String

directory, type String

 

tFileInputDelimited

Add a column named data and set its type to Dynamic.

File name/Stream - set the value to context.directory + context.filename

 

tDBOutput 

 

Connection details (including the host name or IP address, the port number, and the database name).

Fill the Table field with the context variable defined for the table name: context.tablename.

From the Action on table list, select Default.

From the Action on data list, select Insert.

Click Sync columns to ensure the schema matches the input component: a single column named data, of type Dynamic.
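Outside Talend, the child job's logic (read a delimited file whose columns are only known at run time, then insert it into the table named by the context variables) can be sketched roughly as below. sqlite3 stands in for SQL Server and all names are illustrative.

```python
# Rough equivalent of the child job: the header row supplies the
# schema at run time (like Talend's Dynamic type), and the target
# table comes from the context variables. sqlite3 stands in for the
# tDBOutput / SQL Server step; names are illustrative.
import csv
import sqlite3

def load_csv(conn, directory, filename, tablename):
    with open(directory + filename, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)  # dynamic schema: taken from the file itself
        cols = ", ".join(f'"{c}"' for c in header)
        marks = ", ".join("?" for _ in header)
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{tablename}" ({cols})')
        conn.executemany(f'INSERT INTO "{tablename}" VALUES ({marks})',
                         reader)
        conn.commit()
```

The parent job would call this once per file, passing the directory, filename, and tablename it resolved through the tMap lookup.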

 

Anonymous
Not applicable

I think the job design is OK. Let me know if you have any further questions or errors.

 

 

FSternberg
Contributor III
Author

I will test it and let you know.