Kafka Connect configuration is easy - you just write some JSON! But what if you've got credentials that you need to pass? Embedding those in a config file is not always such a smart idea. Fortunately, KIP-297 (released in Apache Kafka 2.0) added support for external secrets. It's extensible so that you can use your own ConfigProvider, and ships with one for reading credentials from a file - which I'll show here. You can read more in the Confluent documentation on externalizing secrets.
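To give a flavour of what writing your own ConfigProvider involves, here's a minimal sketch of one that resolves values from environment variables. This is illustrative only - the class and package names are hypothetical, and this isn't something that ships with Kafka 2.0 (later Kafka versions did add a comparable built-in provider) - but ConfigProvider and ConfigData are the real interfaces you'd implement:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

import org.apache.kafka.common.config.ConfigData;
import org.apache.kafka.common.config.provider.ConfigProvider;

/**
 * Illustrative ConfigProvider that resolves keys from environment variables.
 * Registered with e.g. config.providers=env and
 * config.providers.env.class=com.example.EnvVarConfigProvider (hypothetical
 * names), it would resolve placeholders like ${env::FOO_PASSWORD}.
 */
public class EnvVarConfigProvider implements ConfigProvider {

    @Override
    public void configure(Map<String, ?> configs) {
        // Nothing to configure in this sketch
    }

    @Override
    public ConfigData get(String path) {
        // Environment variables have no meaningful 'path'; return them all
        return new ConfigData(new HashMap<>(System.getenv()));
    }

    @Override
    public ConfigData get(String path, Set<String> keys) {
        // Return only the requested keys that are actually set
        Map<String, String> data = new HashMap<>();
        for (String key : keys) {
            String value = System.getenv(key);
            if (value != null) {
                data.put(key, value);
            }
        }
        return new ConfigData(data);
    }

    @Override
    public void close() {
        // No resources to release
    }
}
```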
Kafka Connect connector secrets management 🔗
- Set up your credentials file, e.g. data/foo_credentials.properties:

```
FOO_USERNAME="rick"
FOO_PASSWORD="n3v3r_g0nn4_g1ve_y0u_up"
```
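Since this file holds the secrets in plaintext, it's worth restricting who can read it - for example, something like:

```bash
chmod 600 data/foo_credentials.properties
```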
- Add the ConfigProvider to your Kafka Connect worker. I run mine with Docker Compose, so the config looks like this. I'm also mounting the credentials file folder into the container:

```yaml
kafka-connect:
  image: confluentinc/cp-kafka-connect:5.2.1
  […]
  environment:
    […]
    # External secrets config
    # See https://docs.confluent.io/current/connect/security.html#externalizing-secrets
    CONNECT_CONFIG_PROVIDERS: 'file'
    CONNECT_CONFIG_PROVIDERS_FILE_CLASS: 'org.apache.kafka.common.config.provider.FileConfigProvider'
  […]
  volumes:
    - ./data:/data
```
- Restart your Kafka Connect worker.
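If you're running the worker with Docker Compose as above, that's simply (assuming the service is named kafka-connect):

```bash
docker-compose restart kafka-connect
```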
- Now simply replace the credentials in your connector config with placeholders for the values:
Before:

```
'{"name": "source-activemq-01",
  "config": {
    "connector.class": "io.confluent.connect.activemq.ActiveMQSourceConnector",
    "activemq.username": "rick",
    "activemq.password": "n3v3r_g0nn4_g1ve_y0u_up",
    […]
```
After:

```
'{"name": "source-activemq-01",
  "config": {
    "connector.class": "io.confluent.connect.activemq.ActiveMQSourceConnector",
    "activemq.username": "${file:/data/foo_credentials.properties:FOO_USERNAME}",
    "activemq.password": "${file:/data/foo_credentials.properties:FOO_PASSWORD}",
    […]
```
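You then create the connector by POSTing this config to the Connect REST API as usual. Here's a sketch, assuming the worker's REST interface is on localhost:8083; the activemq.url, jms.destination.name, and kafka.topic values are hypothetical stand-ins for the settings elided with […] above:

```bash
# Single quotes matter: the ${file:…} placeholders must be resolved by the
# Connect worker, not interpolated by your shell.
curl -s -X POST http://localhost:8083/connectors \
     -H "Content-Type: application/json" \
     -d '{
       "name": "source-activemq-01",
       "config": {
         "connector.class": "io.confluent.connect.activemq.ActiveMQSourceConnector",
         "activemq.url": "tcp://activemq:61616",
         "activemq.username": "${file:/data/foo_credentials.properties:FOO_USERNAME}",
         "activemq.password": "${file:/data/foo_credentials.properties:FOO_PASSWORD}",
         "jms.destination.name": "my-queue",
         "kafka.topic": "activemq-data"
       }
     }'
```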
Kafka Connect worker secrets management 🔗
You can use the same approach to externalise sensitive values from the worker configuration file itself too. Whilst the worker masks sensitive values in the logfile, the plaintext is still stored in the worker configuration file. Moving it to another file as shown below may not buy you much (unless that file is secured differently), but combined with a configuration provider backed by something like a password vault it becomes very useful.
As above, in the worker configuration, define the config provider. For a file provider it looks like this:
Properties file:

```
config.providers=file
config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider
```

Docker environment variables:

```
CONNECT_CONFIG_PROVIDERS: 'file'
CONNECT_CONFIG_PROVIDERS_FILE_CLASS: 'org.apache.kafka.common.config.provider.FileConfigProvider'
```
For the file provider, create a file with the key/value configuration items, e.g. /data/connect_external.properties:

```
SSL_KEYSTORE_PASSWORD=nevergonnagiveyouup
GROUP_ID=my_connect_group
```
In the worker configuration, specify the configuration items you'd like to source from the configuration provider, just the same as you would for a connector itself. For example, to override the group id and the SSL keystore password using the config specified in the sample file above:
Properties file:

```
group.id=${file:/data/connect_external.properties:GROUP_ID}
ssl.keystore.password=${file:/data/connect_external.properties:SSL_KEYSTORE_PASSWORD}
```
Docker environment variables - note the double $$, since a single $ on its own will give you the error `Invalid interpolation format` from Docker Compose:

```
CONNECT_GROUP_ID: $${file:/data/connect_external.properties:GROUP_ID}
CONNECT_SSL_KEYSTORE_PASSWORD: $${file:/data/connect_external.properties:SSL_KEYSTORE_PASSWORD}
```
When the Kafka Connect worker launches, you'll see it uses the new values. Since SSL credentials are masked in the log anyway, you just see that it's a hidden value:
```
[2020-06-16 13:03:09,721] INFO DistributedConfig values:
…
    group.id = my_connect_group
    ssl.keystore.password = [hidden]
```
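A related benefit: the worker stores and returns the raw configuration and only resolves placeholders internally, so querying a connector's config over the REST API should show the placeholder rather than the secret itself. For example, assuming the connector name and REST port from above:

```bash
# Returns the stored config, with the unresolved ${file:…} placeholders
curl -s http://localhost:8083/connectors/source-activemq-01/config
```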