Hi all,
I'm using Talend Open Studio for Big Data 6.3, and I want to use DynamoDB in my job.
The job is pretty simple: read a file, extract the fields, then insert the rows into DynamoDB.
My DynamoDB table's provisioned write capacity is 2000, so I expected the job to insert around 2000 rows/s. In practice it inserts only about 31 rows/s.
Is there a faster way to insert into DynamoDB?
Thank you in advance.
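As background for the numbers above (this sketch is not from the original thread): DynamoDB provisioned write throughput is counted in write capacity units, where one unit covers one write per second of an item up to 1 KB, and larger items are rounded up to the next whole KB. Assuming those standard rules, the arithmetic looks like this:

```java
// Sketch of DynamoDB write-capacity arithmetic (assumption: standard
// provisioned-throughput rules: 1 WCU = one write per second of an item
// up to 1 KB, with larger items rounded up to the next whole KB).
public class WcuEstimate {

    // WCUs consumed by a single put of an item of the given size in bytes.
    static long wcusPerWrite(long itemSizeBytes) {
        return (itemSizeBytes + 1023) / 1024; // round up to whole KBs
    }

    // Maximum rows/second the table's provisioned WCUs can absorb.
    static long maxRowsPerSecond(long provisionedWcus, long itemSizeBytes) {
        return provisionedWcus / wcusPerWrite(itemSizeBytes);
    }

    public static void main(String[] args) {
        // With 2000 WCUs and 1 KB items the table side allows 2000 writes/s,
        // so a measured rate of 31 rows/s points at the client (for example
        // sequential, single-threaded puts), not at provisioned capacity.
        System.out.println(maxRowsPerSecond(2000, 1024));
        System.out.println(maxRowsPerSecond(2000, 4096));
    }
}
```

In other words, if the items are around 1 KB, 2000 WCUs are nowhere near the bottleneck at 31 rows/s; the limit is almost certainly how the component issues the writes.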
Hello,
In the advanced settings of the tDynamoDBOutput component there are "Read Capacity Unit" and "Write Capacity Unit" fields, which specify the number of read/write capacity units. For more information, please refer to the tDynamoDBOutput component documentation.
Best regards
Sabrina
Hello,
I set both values to 2,000, on the DynamoDB table and on the tDynamoDBOutput component, and the throughput is still the same.
Thank you for the reply.
Hi,
I'm not sure whether this issue was ever resolved. I have a similar issue, and I believe it is a problem with the Talend tDynamoDBOutput component: it throws a strange error when "Action on Data" is set to "Insert".
Try setting "Action on Data" to "Update" instead. That works for me, and I'm able to use this setting to update existing records as well as insert new records into my DynamoDB table.
Thanks,
Siva
When "Action on Data" is set to "Insert", the job fails with:
Starting job BD_Job at 13:53 28/04/2019.
java.lang.VerifyError: Inconsistent stackmap frames at branch target 1601
Exception Details:
Location:
local_project/bd_job_0_1/BD_Job.tRowGenerator_1Process(Ljava/util/MapV @1601: ldc_w
Reason:
Type top (current frame, locals[17]) is not assignable to 'com/amazonaws/services/dynamodbv2/document/spec/PutItemSpec' (stack map, locals[17])
Current Frame:
bci: @788  flags: { }  locals: { 'local_project/bd_job_0_1/BD_Job', 'java/util/Map', integer, 'java/lang/String', 'java/lang/String', 'java/util/Map', integer, 'local_project/bd_job_0_1/BD_Job$row1Struct', 'local_project/bd