AkbarShaikh
Contributor III

Autofit Columns when storing a file in Excel

Hi,

I am using the STORE function to extract data from a QVD and save it in CSV format for Excel. The data has a 'Description' field which distorts the report when it is stored. Since the Description field contains a lot of multi-line text, Excel reads every new line as a new cell. If I remove the Description field, the data is stored correctly.

Can someone please help me AUTOFIT the Description column in Qlik before I use the STORE function?

//*******************//

Temp_Table:
LOAD
    KeyNumber,
    Opened_Date,
    Short_Description,
    Description,
    Close_Notes
FROM $(vQVDDataPath)\test.qvd (qvd)
WHERE Opened_Date >= $(vDataLimit);

STORE Temp_Table INTO \\Qlikview\Projects\Incidents Extration Test.csv (txt);

//**************//

Any help is really appreciated.

Regards,

a

6 Replies
marcus_sommer

What does the data look like if you open the CSV not with Excel but with an editor like Notepad++? I assume it's not really an Excel issue, but rather that your Description field contains line breaks and/or other special characters. If so, you could remove them with purgechar() or replace them with another character or spaces using replace().
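For example, a minimal sketch of the replace() approach, using the field names from the original post (exactly which characters need handling is an assumption until the file is inspected in a text editor):

```
Temp_Table:
LOAD
    KeyNumber,
    Opened_Date,
    Short_Description,
    // Replace carriage returns and line feeds with spaces so the
    // multi-line text stays in a single CSV field
    Replace(Replace(Description, Chr(13), ' '), Chr(10), ' ') as Description,
    Close_Notes
FROM $(vQVDDataPath)\test.qvd (qvd)
WHERE Opened_Date >= $(vDataLimit);
```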

- Marcus

AkbarShaikh
Contributor III
Author

Hi Marcus,

Thank you for replying, and apologies for the delayed response.

I did try the PurgeChar option and it successfully removed the characters from the Description field, but when the data is stored and opened in Excel, text still appears in the key field (Inc_Number).

Inc_Number has alphanumeric data, like this one: INC000110. Is this the reason the 'STORE' function gives an error?

When I run the script without the Description field, it gives correct output.

Here's my script:

Temp_Table:
First 10000
LOAD
    Number,
    Opened_Date,
    Resolved_Date,
    Closed_Date,
    Updated_Date,
    State,
    Category,
    Classification,
    Short_Description,
    Description,
    PurgeChar(PurgeChar(PurgeChar(PurgeChar(Description, '*'), '-'), ','), '*') as Test_Description
FROM $(vQVDDataPath)\SE_Full.qvd (qvd);

STORE Temp_Table INTO $(vPath)\Extration_Data.csv (txt);

 

It would be really helpful if you could provide a solution to this.

Regards,

a

marcus_sommer

You don't need to nest purgechar() multiple times; you can remove several characters in one call. Further, I'm not sure your specified characters really cause your issue. If I understand your description correctly, it's rather caused by line breaks and/or tab characters, which might be interpreted as field delimiters. That means it should rather look like the following:

purgechar(Description, chr(10) & chr(13) & chr(9) & chr(44)) as Description

Invisible characters are difficult to type - you may copy & paste them from some source - but it's better to use the chr() function. Also, you must not store the original field as well, and you need to apply this kind of adjustment to every field with similar content. If this suggestion doesn't return the expected results, you need to check the txt files in an editor like Notepad++ to see the whole content of the file.
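Putting this together, a sketch of a full load where the cleaned field replaces the original before the STORE (table and field names taken from the earlier post; the character list follows the purgechar() line above, and chr(44) is the comma):

```
Temp_Table:
First 10000
LOAD
    Number,
    Opened_Date,
    State,
    Short_Description,
    // Strip line feeds, carriage returns, tabs and commas in one call,
    // and keep only the cleaned field under the original name
    PurgeChar(Description, Chr(10) & Chr(13) & Chr(9) & Chr(44)) as Description
FROM $(vQVDDataPath)\SE_Full.qvd (qvd);

STORE Temp_Table INTO $(vPath)\Extration_Data.csv (txt);
```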

- Marcus

AkbarShaikh
Contributor III
Author

Hi Marcus,

You are right about the cause - primarily it seems to be due to line breaks and tab characters, and the script you shared did help reduce the stray characters, but the problem still exists.

When the data is stored and opened in Excel, the first column, which is a key field (example: INC00010), still displays a few lines from the Description field. I am trying to apply Chr(39) to each record in the Description field and then PurgeChar it with your script in another resident load, but I am getting an error in the script itself.

Is that the way to go about this issue, or am I completely off track?

Regards,

a

 

marcus_sommer

As already mentioned, you need to look at the results in an editor like Notepad++ to ensure you can see and identify every single character. Only with this kind of view can you see what the data really looks like, because Excel won't show it that way; instead it interprets the data with a lot of its own "intelligence", which is hardly customizable (this behaviour often leads to more disadvantages than benefits).

Further, you mentioned the use of quotes - chr(39). As far as possible, I suggest avoiding them, because they usually cause more problems than they solve.

Nevertheless, I would try to load the data into Excel in a custom mode. This means preventing the auto-interpretation by using Excel's "Text to Columns" wizard: open a new, empty Excel file, write any content such as 'xyz' into A1, run the wizard, and apply a field delimiter that is different from the one in your txt. Nothing will happen, but Excel keeps this delimiter for the session until a different one is chosen. Now copy & paste the txt content into A1 and open the wizard again, stepping through all the options and watching what happens in the preview window. Sometimes this works better than the auto-interpretation, or it may give you some hints about what Excel is struggling with.

- Marcus

Brett_Bleess
Former Employee

Akbar, were you able to get things sorted out using Marcus' advice? If so, please do not forget to return to your thread and mark the post(s) that helped with the Accept as Solution button; or, if you figured something else out, consider posting that and then use the button to mark that post. If you are still working on things, leave an update post with where you are at this point - but hopefully using Notepad++ or another text editor allowed you to see what was causing the issue.

Regards,
Brett

To help users find verified answers, please do not forget to use the "Accept as Solution" button on any post(s) that helped you resolve your problem or question.
I now work a compressed schedule, Tuesday, Wednesday and Thursday, so those will be the days I will reply to any follow-up posts.