NOTE: Running services
NOTE: put these lines in /etc/hosts.
127.0.0.1 broker
127.0.0.1 connect
127.0.0.1 control-center
127.0.0.1 ksql-datagen
127.0.0.1 ksqldb-cli
127.0.0.1 ksqldb-server
127.0.0.1 rest-proxy
127.0.0.1 schema-registry
127.0.0.1 zookeeper
NOTE: to register the schema and receive its ID.
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data-binary @AAA.json http://schema-registry:8081/subjects/AAA-value/versions
=> {"id":53}
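For reference, AAA.json is the standard Schema Registry registration payload; a minimal sketch, assuming the schema is the single-field message implied by the {"message":"test"} test record below:

{
  "schemaType": "PROTOBUF",
  "schema": "syntax = \"proto3\";\nmessage AAA {\n  string message = 1;\n}\n"
}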
NOTE: test producer and consumer with schema
$ docker-compose exec schema-registry kafka-protobuf-console-producer --bootstrap-server broker:29092 --property value.schema.id="53" --property schema.registry.url="http://schema-registry:8081" --topic AAA
<= {"message":"test"}
$ docker-compose exec schema-registry kafka-protobuf-console-consumer --bootstrap-server broker:29092 --from-beginning --property value.schema.id="53" --property schema.registry.url="http://schema-registry:8081" --skip-message-on-error --topic AAA
=> {"message":"test"}
With the console tools, everything works as expected.
When I produce the protobuf message from Node.js, however, the serialization is off and the consumer above shows errors. I tested with two libraries, google-protobuf and protobufjs, and both give me the same buffer. I couldn't find any way to inspect the data inside Kafka.
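One thing worth checking (an assumption on my part, not something confirmed above): kafka-protobuf-console-consumer expects Confluent's wire format, i.e. a magic byte, the 4-byte big-endian schema ID, and (for protobuf) a message-index list in front of the protobuf payload, while google-protobuf's serializeBinary() produces only the raw payload. A minimal Node.js sketch of that framing, assuming AAA.proto is the single-message schema above and that the generated module exports the AAA class directly:

// frame.js -- a sketch, not the actual AAA/BBB samples from this post.
const pb = require('./AAA_pb');

const SCHEMA_ID = 53; // the ID returned by the registry above

// Raw protobuf bytes -- sending these alone is what the console consumer rejects.
const msg = new pb.AAA();
msg.setMessage('test');
const payload = Buffer.from(msg.serializeBinary());

// Confluent wire format: magic byte 0x00, 4-byte big-endian schema ID,
// then the message-index list; the first message in the .proto file is
// encoded as a single 0x00 byte.
const header = Buffer.alloc(6);
header.writeUInt8(0, 0);            // magic byte
header.writeUInt32BE(SCHEMA_ID, 1); // schema ID
header.writeUInt8(0, 5);            // message index 0

const value = Buffer.concat([header, payload]);
// `value` is what should go to the AAA topic, e.g. with kafkajs:
// producer.send({ topic: 'AAA', messages: [{ value }] })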
NOTE: to generate AAA_pb.js; you can skip this step if you like.
$ brew install protobuf
$ protoc --proto_path=. --js_out=import_style=commonjs_strict:. AAA.proto
$ brew install node@12 yarn
$ yarn install
# google-protobuf sample
$ node AAA
# protobufjs sample
$ node BBB