_AnonymousUser
Specialist III

How to Load Data into Cassandra Output Using Spark

Hi,
I want to load data into Cassandra that has been processed through Spark components.
Can you please suggest how to load data into Cassandra through Spark components using Talend?
Regards,
Kiran Bhoknal
10 Replies
Anonymous
Not applicable

Hi Kiran Bhoknal,
Please take a look at the JIRA issue: https://jira.talendforge.org/browse/TBD-1749.
Best regards
Sabrina
_AnonymousUser
Specialist III
Author

Thanks Sabrina...

Talend has resolved my issue. Please find the JIRA link below:
https://jira.talendforge.org/browse/TBD-1749

I just want to test that updated component. I have checked Talend's GitHub repository but did not find the updates. Is it available in TOS 6.0.0 M4?

Regards,
Kiran B.

Anonymous
Not applicable

Hi,
"I just want to test that updated component. I have checked Talend's GitHub repository but did not find the updates. Is it available in TOS 6.0.0 M4?"

We will keep you posted as soon as this feature is available.
Best regards
Sabrina
Anonymous
Not applicable

Any updates, Sabrina?
Thanks
Kiran B
Anonymous
Not applicable

Hi Kiran B,
The status of this JIRA issue is "Development done".
For testing only, you can check it in 6.0.0 M4. Please see the announcement post: https://www.talendforge.org/forum/viewtopic.php?id=43601
Best regards
Sabrina
Anonymous
Not applicable

Hi Sabrina,
I have checked 6.0.0 M4, but that component is not present in the Palette.
Please let me know once it is available.

Thanks,
Kiran B.  
Anonymous
Not applicable

Hi,
As a matter of fact, this new feature is only available in the Talend Enterprise Subscription version, not in TOS 6.0.0 M4.
Best regards
Sabrina
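
For readers who only have TOS and cannot use the Talend Spark/Cassandra component, below is a rough sketch of the same load done directly with the open-source DataStax Spark Cassandra connector rather than through a Talend component. The keyspace, table, column names, and connection host are placeholders, and the connector JAR must be on the job classpath.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._ // adds saveToCassandra() to RDDs

object LoadToCassandra {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("LoadToCassandra")
      // Placeholder contact point for the Cassandra cluster
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext(conf)

    // Stand-in for data already processed by upstream Spark steps
    val rows = sc.parallelize(Seq(
      (1, "alice", "alice@example.com"),
      (2, "bob",   "bob@example.com")
    ))

    // Write to keyspace "ks", table "users" (placeholders); the table must already exist
    rows.saveToCassandra("ks", "users", SomeColumns("id", "name", "email"))

    sc.stop()
  }
}
```

The tuple fields are mapped positionally to the columns listed in SomeColumns, so the column order must match the data.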
Anonymous
Not applicable

Okay...
Thanks for the reply.