Route over kafka-logger plugin to integrate with SSL enabled Kafka #12659

myselfmayur1234 started this conversation in General

Hi All,

I am trying to build a route using the kafka-logger plugin and supplied all the required attributes mentioned in the documentation. Kafka is SASL_SSL enabled, and I supplied all the certificates (client_cert, client_key, ca_cert) in the plugin configuration.

When I trigger the route with a curl command, I get a "502 Bad Gateway" error.
In the error.log file I found an SSL_read failure:

SSL_read() failed (SSL: error:0A000412:SSL routines::ssl/tls alert bad certificate:SSL alert number 42 )

Options I tried to verify connectivity from the APISIX host:

  1. Verified the certificate using openssl --> verification OK.
  2. Using a Python script with the same SSL parameters, I was able to connect to the SSL-enabled Kafka topic and publish data.
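For anyone trying to reproduce step 2, a minimal sketch using only the Python standard library is below. It performs the same kind of TLS handshake (server verification plus an optional client certificate) that APISIX's cosocket has to complete; the host, port, and certificate paths are placeholders, not values from the real setup.

```python
import socket
import ssl

def make_tls_context(ca_file=None, cert_file=None, key_file=None):
    """Build a client-side TLS context, optionally with mutual-TLS material."""
    ctx = ssl.create_default_context(cafile=ca_file)  # verifies the server cert
    if cert_file and key_file:
        ctx.load_cert_chain(cert_file, key_file)      # client cert for mTLS
    return ctx

def handshake(host, port, ctx):
    """Connect, complete the TLS handshake, and return the negotiated cipher."""
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.cipher()

# Example usage (all placeholders):
# ctx = make_tls_context("/path/to/ca.pem",
#                        "/path/to/client.pem", "/path/to/client.key")
# print(handshake("kafka.example.com", 9093, ctx))
```

If this handshake succeeds from the APISIX host but the plugin still logs alert 42, the broker is rejecting the certificate APISIX presents, which points at how the client certificate is supplied to the plugin rather than at network connectivity.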

Could someone please suggest a clue as to what is going wrong?


Replies: 2 comments 1 reply


Could you please provide a minimal reproducible environment using Docker Compose? This would help us troubleshoot the issue.

1 reply
@myselfmayur1234

@Baoyuantop Can you please let me know how to supply the SSL certificates in a way the kafka-logger plugin can read them? That's all I want to know.


Thanks for your response.
I don't have a Docker Compose setup; APISIX is installed on a Linux server, and I cannot share the configuration externally for security reasons.

Below is the route I created:

```shell
curl http://127.0.0.1:9180/apisix/admin/routes/5 -H "X-API-KEY: $admin_key" -X PUT -d '
{
  "plugins": {
    "kafka-logger": {
      "brokers": [
        {
          "host": "127.0.0.1",
          "port": 9092,
          "sasl_config": {
            "enable": true,
            "user": "XXX",
            "password": "XXX",
            "mechanism": "PLAIN"
          },
          "ssl": true,
          "ssl_verify": true
        }
      ],
      "kafka_topic": "test2",
      "key": "key1",
      "batch_max_size": 1,
      "name": "kafka logger"
    }
  },
  "upstream": {
    "nodes": {
      "127.0.0.1:1980": 1
    },
    "type": "roundrobin"
  },
  "uri": "/hello"
}'
```
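As a sanity check, the same payload can be assembled in Python and validated as JSON before it is sent to the Admin API. This is just a sketch; every value (broker address, credentials, topic) is the placeholder from the curl command above.

```python
import json

# Build the kafka-logger route payload from the thread as a Python dict,
# then serialize it so any structural mistake fails loudly here instead
# of producing a confusing Admin API error.
route = {
    "uri": "/hello",
    "plugins": {
        "kafka-logger": {
            "brokers": [
                {
                    "host": "127.0.0.1",
                    "port": 9092,
                    "sasl_config": {
                        "enable": True,
                        "user": "XXX",
                        "password": "XXX",
                        "mechanism": "PLAIN",
                    },
                    "ssl": True,
                    "ssl_verify": True,
                }
            ],
            "kafka_topic": "test2",
            "key": "key1",
            "batch_max_size": 1,
            "name": "kafka logger",
        }
    },
    "upstream": {
        "nodes": {"127.0.0.1:1980": 1},
        "type": "roundrobin",
    },
}

payload = json.dumps(route, indent=2)
print(payload)  # paste this into the curl -d '...' body
```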

I am not sure where to supply the client certificate, private key, and trusted certificates. I tried following the lua-resty-kafka documentation and setting the certificates with the directives below:

`lua_ssl_trusted_certificate`, `lua_ssl_certificate_key`, `lua_ssl_certificate`
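One thing worth noting: APISIX regenerates its nginx.conf from conf/config.yaml, so hand-editing these directives into nginx.conf is overwritten on restart. A hedged way to inject them is through the nginx_config snippet options in config.yaml. This is only a sketch: the certificate paths are placeholders, and whether the plugin's cosocket honors `lua_ssl_certificate` depends on the OpenResty version in use.

```yaml
# conf/config.yaml -- a sketch, assuming nginx_config.http_configuration_snippet
# is available in this APISIX version; all paths are placeholders
nginx_config:
  http_configuration_snippet: |
    lua_ssl_trusted_certificate /path/to/ca.pem;
    lua_ssl_certificate         /path/to/client.pem;
    lua_ssl_certificate_key     /path/to/client.key;
    lua_ssl_verify_depth        3;
```

After changing config.yaml, APISIX needs to be reloaded for the generated nginx.conf to pick up the snippet.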

Category
General
Labels
None yet
2 participants
@myselfmayur1234 @Baoyuantop
