Anonymous
Not applicable

Data source service deploy on runtime

Hello.

I've created a datasource.xml file to store several connections and use "Specify a data source alias" in Studio.

The datasource.xml file looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
    xmlns:cm="http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0">

    <bean id="ds1" class="com.microsoft.sqlserver.jdbc.SQLServerConnectionPoolDataSource">
        <property name="URL" value="..." />
        <property name="user" value="..." />
        <property name="password" value="..." />
    </bean>

    <bean id="dspool1"
        class="org.apache.commons.dbcp.datasources.SharedPoolDataSource" destroy-method="close">
        <property name="connectionPoolDataSource" ref="ds1" />
        <property name="maxActive" value="20" />
        <property name="maxIdle" value="5" />
        <property name="maxWait" value="-1" />
    </bean>

    <service ref="dspool1" interface="javax.sql.DataSource">
        <service-properties>
            <entry key="org.talend.esb.datasource.name" value="name1" />
            <entry key="osgi.jndi.service.name" value="jdbc/name1" />
        </service-properties>
    </service>

    <!-- ... another data source / pool / service ... -->
</blueprint>

Everything works fine when I deploy datasource.xml by dropping it into container/deploy.
But I want this service to start automatically when the Runtime starts, so I'm creating a feature repository and I don't know how to define this service there.
I was able to add and auto-start the other bundles I needed this way:

<feature name="dataSourceFeature" version="1.0">
  <bundle>wrap:mvn:commons-dbcp/commons-dbcp/1.4</bundle>
  <bundle>wrap:mvn:com.microsoft.sqlserver/sqljdbc4/4.0</bundle>
</feature>

and then add dataSourceFeature to featuresBoot.
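
For reference, this feature sits in an ordinary Karaf feature repository file; mine looks roughly like this (the repository name is just an example):

<?xml version="1.0" encoding="UTF-8"?>
<features name="my-features" xmlns="http://karaf.apache.org/xmlns/features/v1.2.0">

    <feature name="dataSourceFeature" version="1.0">
        <bundle>wrap:mvn:commons-dbcp/commons-dbcp/1.4</bundle>
        <bundle>wrap:mvn:com.microsoft.sqlserver/sqljdbc4/4.0</bundle>
    </feature>

</features>

The repository URL itself is listed under featuresRepositories in etc/org.apache.karaf.features.cfg.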

I don't even know how to install and start datasource.xml manually.

When I run bundle:install on the Karaf console with the mvn URL, I get:
Error executing command: Error installing bundles:
Unable to install bundle mvn:com......./datasource/0.3/xml: org.osgi.framework.BundleException: Error occurred installing a bundle.

Labels (1)
  • v7.x

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

I *think* you will need to have the XML file in an artefact repository like Nexus. Then configure your Nexus in the file I mentioned last time. Once you have done that, you should be able to refer to the artefact using an mvn: URL.


10 Replies
Anonymous
Not applicable
Author

Maybe I am misunderstanding the issue here, but if you have added that file to the deploy folder, it will start automatically when the Runtime starts.

Anonymous
Not applicable
Author

You are right, but I don't want to keep features in deploy. I want to
configure it in featuresBoot and always get the latest version from the
Maven repository on startup, and finally place Talend ESB in Docker. So every
time I restart Docker I've got all the needed features in the latest version.
Anonymous
Not applicable
Author

I see. Do you have the Maven repository you are using to hold the datasource.xml listed in this file:

etc\org.ops4j.pax.url.mvn.cfg

I don't know that this is your problem, but it is my first thought now that I understand what you are trying to do.
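
The repositories that the mvn: protocol resolves against are listed there under org.ops4j.pax.url.mvn.repositories, so your Nexus would need to be added to that comma-separated list, roughly like this (host and repository name are just examples; keep the entries that are already there):

org.ops4j.pax.url.mvn.repositories = \
    http://localhost:8081/repository/maven-releases/@id=nexus, \
    <the existing default entries>

The @id= suffix just names the entry, and adding @snapshots to an entry would also allow snapshot versions to be resolved from it.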

 

By the way, what you are doing sounds like an interesting approach. When you get it working, how would you feel about writing a blog on how you have done this? I think it would be a very interesting topic.

Anonymous
Not applicable
Author

Yes, I've got it.

And as I wrote, I can automatically deploy the dataSourceFeature feature on startup, but that feature contains two bundles which are jar files: commons-dbcp and sqljdbc.
The problem is that I don't know how to define a feature containing a bundle with datasource.xml, since it's an XML file.

I don't even know how to do this on the console.
I've tried

bundle:install mvn:com.../datasource/0.3/xml

but I'm getting an error:
Error executing command: Error installing bundles:
        Unable to install bundle mvn:com.../datasource/0.3/xml: org.osgi.framework.BundleException: Error occurred installing a bundle.

That's why I don't know how to create a feature and put the XML file with the data sources in it.

 

Anonymous
Not applicable
Author

I *think* you will need to have the XML file in an artefact repository like Nexus. Then configure your Nexus in the file I mentioned last time. Once you have done that, you should be able to refer to the artefact using an mvn: URL.
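
For example, once the XML is available in Nexus under some Maven coordinates, the feature could point at it. A plain XML file is not an OSGi bundle though, so I believe it would need the blueprint: prefix, which is what the deploy folder uses behind the scenes (the coordinates below are made up):

<feature name="dataSourceFeature" version="1.0">
    <bundle>wrap:mvn:commons-dbcp/commons-dbcp/1.4</bundle>
    <bundle>wrap:mvn:com.microsoft.sqlserver/sqljdbc4/4.0</bundle>
    <bundle>blueprint:mvn:com.example/datasource/0.3/xml</bundle>
</feature>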

Anonymous
Not applicable
Author

You are probably right.

Because all Maven artifacts are treated as archives, I think I must create a Maven project which produces a jar or other archive containing my XML configuration.
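
A minimal sketch of such a project could look like this; the coordinates are placeholders, and I'm assuming the blueprint file is copied to src/main/resources/OSGI-INF/blueprint/datasource.xml so it is started as an ordinary Blueprint bundle:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>

    <!-- placeholder coordinates -->
    <groupId>com.example</groupId>
    <artifactId>datasource-config</artifactId>
    <version>0.3</version>
    <packaging>bundle</packaging>

    <build>
        <plugins>
            <!-- builds the jar as an OSGi bundle; the blueprint XML under
                 OSGI-INF/blueprint/ is picked up when the bundle starts -->
            <plugin>
                <groupId>org.apache.felix</groupId>
                <artifactId>maven-bundle-plugin</artifactId>
                <version>3.5.1</version>
                <extensions>true</extensions>
            </plugin>
        </plugins>
    </build>
</project>

The Import-Package instruction may need a tweak so the SQL Server driver and commons-dbcp packages are imported from the wrapped bundles in the feature.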

Thanks

Anonymous
Not applicable
Author

I think you said that you are using Docker. Here is a docker-compose.yml for a quick test with Nexus:

 

version: '3'

services:
      
  nexus-talend:
    image: sonatype/nexus3
    volumes:
      - ./nexus-data:/nexus-data
    ports:
      - "8081:8081"
Anonymous
Not applicable
Author

Hello again.
I managed to achieve what I wanted.
I created a jar artifact from the datasource.xml file and uploaded it to my Nexus repository.
I added my data source bundle to featuresBoot and everything works fine on the Runtime using Talend Open Studio.
But now we have moved to the licensed version of Talend.
We chose the hybrid version, so we have the Runtime on premises and management in the cloud.
Now I wonder how I should manage database connections.
Can I somehow publish my jar with data sources to the cloud, outside the Studio?

Or is there a way to define and manage database connections in the Management Console in the cloud?
I need to use a few connections (MS SQL) and change them depending on the environment.
What is best practice?

Anonymous
Not applicable
Author

If your runtime is on-premises, then you should be able to do this in the same way as before. There may be some slight differences, but ultimately it should be the same. You will not need to pass connection information to the Cloud since that is simply where your routes/services are controlled from. They are actually running locally.
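
One way to handle the per-environment connections, for example, is to let the blueprint read its values from the Config Admin service through the cm namespace you already declare, and keep a different etc/*.cfg file per environment. A sketch (the persistent-id and property names are made up):

<cm:property-placeholder persistent-id="datasource.mssql" update-strategy="reload">
    <cm:default-properties>
        <cm:property name="db.url" value="jdbc:sqlserver://localhost:1433;databaseName=demo" />
        <cm:property name="db.user" value="sa" />
        <cm:property name="db.password" value="changeit" />
    </cm:default-properties>
</cm:property-placeholder>

<bean id="ds1" class="com.microsoft.sqlserver.jdbc.SQLServerConnectionPoolDataSource">
    <property name="URL" value="${db.url}" />
    <property name="user" value="${db.user}" />
    <property name="password" value="${db.password}" />
</bean>

The actual values then live in etc/datasource.mssql.cfg on each Runtime, so the same bundle can be deployed everywhere.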