suvbin
Creator III

HadoopExecutionException: Too many columns in the line.

Hi Team,

A global rule ("added date column") was added to the task. While performing a full load, some tables loaded successfully and others did not, failing with a "too many columns" error at the target end. Please find the error below.

Source: SQL Server

Target: Synapse Analytics

Full load setting: Drop and create

00014344: 2024-02-07T21:02:23 [TARGET_LOAD     ]E:  Failed (retcode -1) to execute statement: 'INSERT INTO [asdw-r5].[SCM].[group_Table] ( [group_seqno],[name],[TUTCDATE] ) SELECT [group_seqno],[name],[TUTCDATE] FROM [asdw-r5].[SCM].[ATTREP_EXT_5b061624_d3d6_3c42_8835_07da74ac5634_51];' [1022502] (ar_odbc_stmt.c:4996)
00014344: 2024-02-07T21:02:23 [TARGET_LOAD     ]E:  RetCode: SQL_ERROR  SqlState: 42000 NativeError: 107090 Message: [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopExecutionException: Too many columns in the line. Line: 1 Column: -1 [1022502]  (ar_odbc_stmt.c:5003)

4 Replies
DesmondWOO
Support

Hi @suvbin ,

Thank you for reaching out to us. 

Could you verify that the number of columns and the table structure match between the source and target tables?

Regards,
Desmond

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!
john_wang
Support

Hello @suvbin ,

The SQL statement references only 3 columns, so the error message "Too many columns in the line" must be caused by something else. One possibility is that the data contains the same character string as the column delimiter. I'm not sure which Replicate version you are running or what your delimiter setting is; by default in Qlik Replicate 2023.5 the delimiter string is "#$#". If your data happens to contain that same string, it is interpreted as a new column rather than as part of a single column's value, hence the error above.
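
If you want to confirm whether this is the cause before changing anything, you can scan the source data for the default delimiter string. The query below is only an illustrative sketch, assuming the default "#$#" delimiter and reusing the table and column names from the error log (adjust them to your actual schema):

  -- Run against the SQL Server source.
  -- Any rows returned contain the default Replicate delimiter "#$#"
  -- in the [name] column and would be split into extra columns when
  -- the staged file is parsed on the Synapse side.
  SELECT [group_seqno], [name]
  FROM [SCM].[group_Table]
  WHERE [name] LIKE '%#$#%';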

Please try using a different delimiter string (one that does not appear in your data). The steps:

  1. Open the Synapse target endpoint.
  2. Go to the Advanced tab.
  3. Open Internal Parameters.
  4. Add a new parameter named $info.query_syntax.csv_delimiter.
  5. Press <Enter> and set the parameter's value to a special string, e.g. @@$@@.

Hope this helps.

John.

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!
deepaksahirwar
Creator II

Hi Team,

Welcome! We’re delighted to have you in the Qlik Community portal. Thank you for bringing your question to us.

The error message "HadoopExecutionException: Too many columns in the line" typically occurs when the number of columns in the data does not match the schema definition. In your case, the global rule "added date column" introduces an extra column, which could be the source of the mismatch.

Here are a few steps you can take to troubleshoot this issue:

  1. Check the Schema: Ensure that the schema definition matches the data; the number of columns in the target table should match the number of columns Replicate delivers, including any columns added by global rules (a quick column-count check is sketched after this list).
  2. Check the Global Rule: Review the global rule “added date column”. If this rule is adding an extra column, it could be causing the mismatch.
  3. Test with a Subset of Data: Try running the task with a smaller subset of data. If the task runs successfully, the issue might be with specific rows or columns in your data.
  4. Check for Special Characters: Special characters in the data can sometimes cause issues. Make sure your data does not contain any unexpected special characters.
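
As a rough sanity check for step 1 above, you can compare column counts on both sides. This is only a sketch: the schema and table name (SCM.group_Table) are taken from the error log in the original post, so substitute your own object names. INFORMATION_SCHEMA.COLUMNS exists on both SQL Server and Azure Synapse dedicated SQL pools, so the same query can be run against source and target; any difference in the counts should correspond exactly to the columns added by global rules (such as the added date column).

  -- Run on the SQL Server source and again on the Synapse target.
  -- Adjust the schema/table names to your environment.
  SELECT COUNT(*) AS column_count
  FROM INFORMATION_SCHEMA.COLUMNS
  WHERE TABLE_SCHEMA = 'SCM'
    AND TABLE_NAME   = 'group_Table';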

 

If none of these steps resolve the issue, it might be helpful to reach out to Qlik Support for further assistance.

I hope this information helps! If you have any other questions, feel free to ask.

If our response has been helpful, please consider clicking “Accept as Solution”. 
This will assist other users in easily finding the answer.

Best Regards,
Deepak

SushilKumar
Support

Hello team,

If our response has been helpful, please consider clicking "Accept as Solution". This will assist other users in easily finding the answer.

Regards,

Sushil Kumar