apoorvasd
Creator II

Loop through different folders in SharePoint

Hello Everyone,

I am trying to loop through Excel files stored in SharePoint using a "for each" loop, as shown below. But for some reason the loop doesn't work and no data is fetched into QlikView. Can anyone guide me here please?

for each FoundFile in FileList('http://servername/processes/ab/abc/abcd/xy%20xyz%202017%20-%20Region%20Plant\FileName*.xlsx')

FileList:
LOAD
    A,
    B,
    C
FROM [$(FoundFile)]
(ooxml...);

next FoundFile;


PS: Service account does have full access to this site in SharePoint.

Thank you.

10 Replies
Anil_Babu_Samineni

When you say SharePoint, are you getting data from a single Excel file instead of multiple files?

apoorvasd
Creator II
Author

No there are multiple excel files.

Enriquemmesteo
Creator

Did you ever solve this? I have the same issue

marcus_sommer

A SharePoint site isn't a classical file storage, and therefore the usual file functions/operations won't work:

https://community.qlik.com/... &q=sharepoint%20 

- Marcus

williamandersson
Partner - Contributor III

Very late to the party on this one haha...

@marcus_sommer What would be some alternatives to Sharepoint?

I'm migrating from QlikView to Qlik Cloud and need to replace a normal folder connection (storing files on disk) with something else. I thought SharePoint would work, but as you mentioned above, since it is not normal file storage I'm unable to loop over all my files with e.g. For Each ... Next.

Do you know of any other replacements for normal disk storage?

marcus_sommer

Within the Qlik Web Connectors there is also a SharePoint connector which can be used to get a list of the available content along with properties such as name and dates and, most importantly, the unique IDs that are needed to access the files. It's a rather ugly approach, and I never dived deeper into it to develop stable and reusable loop solutions. I don't know whether this connector can also be used in the reverse direction, for uploading/storing data. If you search the community more specifically for this, or for alternative connectors, you will find a lot of material.

Before that, take a closer look at the built-in Qlik Cloud features. If I remember and interpret various postings correctly, Qlik Cloud (i.e. the newer SaaS releases) meanwhile has native read/write options against SharePoint. Whether they also directly support advanced (loop) automation, I don't know, but even if not, I would start the investigation there.

Also worth considering is using a data gateway to any target, rather than storing the data in SharePoint or any other web-based cloud.
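To illustrate the list-then-loop idea above in load-script terms: first load the file index the connector returns, then loop over its rows and load each file. This is only a sketch; the connection name, the table, and the field names (Name, Path) are hypothetical placeholders and depend entirely on how the SharePoint connector is configured in your environment.

```qlik
// Sketch only: 'SharePoint_Connector' and the field names are assumptions.
LIB CONNECT TO 'SharePoint_Connector';

// 1) Load the list of available files (name, path/ID, dates, ...).
FileIndex:
LOAD
    Name,
    Path        // or the unique ID field your connector exposes
FROM [FileListTable];   // placeholder for the connector's file-list source

// 2) Loop over the index and load each file by its path/ID.
FOR i = 0 TO NoOfRows('FileIndex') - 1
    LET vFile = Peek('Path', $(i), 'FileIndex');

    Data:
    LOAD A, B, C
    FROM [$(vFile)] (ooxml, embedded labels);
NEXT i
```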

williamandersson
Partner - Contributor III

@marcus_sommer 
Yes, I just found out about the Meta Connector. It seems like it could work for this case even though, as you mentioned, it might not be the prettiest of solutions.

It looks like you can read a bunch of files from the cloud quite easily using FileList with "*"; the tricky part is folder hierarchies, since DirList cannot list the available folders.
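For reference, a wildcard FileList loop over a cloud data connection looks roughly like this. The lib:// connection and folder names are just examples, not the real paths from this thread:

```qlik
// Sketch only: 'SharePoint' is an assumed data-connection name.
FOR EACH vFoundFile IN FileList('lib://SharePoint/Reports/FileName*.xlsx')

    Data:
    LOAD A, B, C
    FROM [$(vFoundFile)] (ooxml, embedded labels);

NEXT vFoundFile
```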

Can you read Excel/CSV files through a data gateway? That is new to me; in that case it might be worth looking into!

marcus_sommer

I think it should be possible with a gateway, but I have no experience with them on the Qlik side. In MS Power BI it's possible with a gateway, which there is a VM with an installed connector sitting between the cloud and the user's systems. I could imagine that a gateway in Qlik works in quite a similar manner. Under heavy traffic a gateway can become a bottleneck, because it's an additional layer and the network and processing performance on all sides must be adequate.

Depending on the available data, and on how they are created/maintained/owned, you might maintain your own table of contents which is then used to pick the wanted files - and yes, this costs some overhead.

Besides file-based approaches, you might also consider writing the data into a database ... It all depends on the amount of data to be stored. AFAIK a cloud storage like SharePoint isn't suitable for reading/writing massive amounts of data ...

williamandersson
Partner - Contributor III

OK! Thanks for the input on this.

I will do some investigation and see where I land. The most obvious solution, and more of a quick fix for now, would be to use both of the SharePoint connectors: the Metadata one to list all available folders/files, and the other to read from the paths found. Later we might opt for a more sustainable solution going forward.


Thanks for the help!