Peachman97
Partner - Contributor III

API calls to Amazon EC2 API endpoint

Hi,

I need to make API calls to the Amazon EC2 API endpoint for a project, but I am struggling to make any headway. Mainly, I am struggling to set up the API call in Qlik in any capacity using the blocks currently available to me. I have created a custom code block with Node.js code to make the API call, but it returns a DNS error. I am not sure of the best approach, or whether what I'm trying to do is even possible.

Has anyone been able to successfully make an API call to AWS in any capacity? (Not AWS S3, as there are dedicated connectors that make that easier.)

9 Replies
Shai_E
Support

Hi @Peachman97 ,

I don't think the custom code block can help you here. If I remember correctly, you won't be able to import outside modules into your script, only the built-in functions and modules of the programming language.

Did you try one of the other connectors, such as API Key, or the Call URL block?


Peachman97
Partner - Contributor III
Author

Hey @Shai_E ,

You can build the authentication logic within the custom code block without external modules, so that's not the problem; the modules built into Node are sufficient to make it work and build the auth logic. The issue is the DNS error mentioned above.

I have tried every connector available to me that deals with URLs/APIs, and none of them can get this to work because of the AWS signature generation that is required. (Full docs on the AWS side here: Authenticating Requests (AWS Signature Version 4) - Amazon Simple Storage Service.)


Peachman97
Partner - Contributor III
Author

@Shai_E  Did you have any further thoughts on this at all?

Shai_E
Support

Hi @Peachman97 ,

Sorry but no.

@AfeefaTk do you know if something like this is possible?

AfeefaTk
Support

Hi @Peachman97 

Sorry, as mentioned by @Shai_E, I don't see any way to connect to Amazon EC2.

Generic connectors cannot be used for EC2 since AWS authentication works differently.

That being said, I'd like to hear about the use case for this connector; if there's a valid need there, we might be able to build it.

Meanwhile, I would suggest creating a feature request for the Amazon EC2 connector in the ideation forums.

https://community.qlik.com/t5/Get-Started/Ideation-Guidelines-How-to-Submit-an-Idea/ta-p/1960234

Thanks!!

Afeefa TK

Peachman97
Partner - Contributor III
Author

Hey @AfeefaTk, thanks for your input. This is as I expected; I had already submitted an ideation for this idea here


My use case is pulling information from the API into Qlik to build a monitoring-type app, as well as storing historic information in QVDs, since some APIs only retain current data. The specific project this is for involves cost analysis and understanding costs.

AfeefaTk
Support

Hi @Peachman97 

Thanks for submitting the feature request.

Meanwhile, I will share the use case internally with the team and see if we can push to have the connector put in place.

I will get back to you with further updates.

Best Regards

AfeefaTk
Support

Hi @Peachman97 

Our connector team thinks that automation might not be a good fit for this scenario, as it's more of a data-loading use case. Perhaps this post would be useful for you.

https://community.qlik.com/t5/Design/Qlik-Partner-Engineering-Team-releases-Qlik-AWS-Cost-Explorer/b...

Thanks

Peachman97
Partner - Contributor III
Author

Hey @AfeefaTk, thanks for the feedback.

Perhaps I wasn't specific enough about my use case, so I'll delve into it some more. I have a Databricks workspace that spins up AWS EC2 compute instances and uses S3 buckets to store the Unity Catalog. I want to get the following information from AWS so I can link it all together in the Qlik dashboard I'm building:

1) The EC2 compute details, to find which instance ID relates to which Databricks warehouse ID, so I can create a link between them and grow this relationship over time.

2) The size and cost of the storage objects being used from the Databricks end.

There are three APIs required: the EC2, S3, and CE endpoints. As mentioned, none of the current API connection options can generate the signature required for AWS authentication from the public and secret keys that can be generated.

A key requirement of the project is being able to roll this out with minimal effort on the AWS end, and utilising the API is currently seen as the way forward to onboard the data into Qlik. The alternative is creating various elements and spinning up databases in AWS and pulling the data through current connectors, but based on the analysis we have completed, that setup is more costly to the EU than the API usage.

Edit: One final addition. I have built and tested all the required API calls in Postman, and they return the data I require. It's purely the authentication that is the issue.