
Securing Kafka Pipelines with TLS/SSL or Kerberos


You can connect a pipeline to a Kafka cluster through SSL and optionally authenticate through SASL. The following SASL authentication mechanisms are supported:

  • GSSAPI (Kerberos)
  • PLAIN
  • SCRAM-SHA-256
  • SCRAM-SHA-512

This topic assumes you have already set up, configured, and enabled SSL and/or Kerberos on your Kafka brokers. For information on how to enable this functionality, see the SSL and SASL sections in the Kafka documentation.

Warning

Using SSL and SASL with Kafka requires Kafka protocol version 0.9 or later; therefore, each pipeline that uses SSL or SASL with Kafka must also adhere to this version requirement. The Kafka protocol version can be passed in through JSON in the CREATE PIPELINE statement's CONFIG clause, for example: CONFIG '{"kafka_version":"0.10.0.0"}'. Alternatively, the pipelines_kafka_version global variable controls this parameter for any pipeline that does not specify a Kafka version in its configuration.
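For example, to set the default Kafka protocol version for every pipeline that does not specify one, you can set the engine variable before creating pipelines (a sketch; '0.10.0.0' is a placeholder, so use the version your brokers support):

-- Sketch: applies to pipelines without a "kafka_version" in their CONFIG;
-- replace '0.10.0.0' with the protocol version your brokers support.
SET GLOBAL pipelines_kafka_version = '0.10.0.0';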

As with S3 and Azure pipelines, credentials are passed in through JSON in the CREATE PIPELINE statement. Any credentials used for encryption or authentication must be present on each node in the cluster.

The security.protocol credential specifies the encryption and authentication mechanisms used to connect to Kafka brokers. This property can be one of four values: plaintext, ssl, sasl_plaintext, or sasl_ssl. Depending on this value, you may have to supply additional credentials in your JSON.
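Putting these pieces together, a CREATE PIPELINE statement takes the following general shape (a sketch using the same angle-bracket placeholders as the steps below; fill in the values for your environment):

CREATE PIPELINE `<pipeline-name>`
AS LOAD DATA KAFKA '<host>:<port>/<topic>'
CONFIG '{"security.protocol": "<plaintext, ssl, sasl_plaintext, or sasl_ssl>"}'
CREDENTIALS '{"<credential-name>": "<credential-value>"}'
INTO table `<table-name>`;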

Connect SingleStore Managed Service to Kafka using TLS/SSL

Applies to: SingleStore Managed Service

Use the following steps to enable TLS/SSL encryption between SingleStore Managed Service and Kafka.

  1. Create a table to use with the Kafka pipeline.

  2. From the Clusters page on the SingleStore Customer Portal, click the cluster for which you want to enable TLS/SSL connections.

  3. Click the Advanced tab at the top of the page.

  4. Click on the Upload Certificate button to upload your CA certificate. This will make it available to all the nodes and will allow you to secure outbound connections via TLS/SSL.

  5. Make a copy of the following SQL statement, which creates the Kafka pipeline, and replace the text in angle brackets with your own values as described in the following steps.

    CREATE PIPELINE `<pipeline-name>`
    AS LOAD DATA KAFKA '<data-source>'
    CONFIG '{"security.protocol": "ssl",
    "ssl.ca.location": "<ca-certificate-file-path>"}'
    CREDENTIALS '{"ssl.key.password": "<your-password>"}'
    INTO table <table-name>;
    
  6. Replace <pipeline-name> with a name for your pipeline.

  7. Replace <data-source> with the Kafka data source.

  8. Replace <ca-certificate-file-path> with the full path to the SSL certificate on the SingleStore node.

  9. If your SSL certificate key is protected by a password, replace <your-password> with that password.

  10. Replace <table-name> with the table you created earlier.

  11. The final SQL statement will resemble:

    CREATE PIPELINE `kafka_ssl`
    AS LOAD DATA KAFKA 'kafka-host:9093/test'
    CONFIG '{"security.protocol": "ssl",
    "ssl.ca.location": "/var/private/ssl/ca-cert.pem"}'
    CREDENTIALS '{"ssl.key.password": "your-ssl-key-password"}'
    INTO table `your-table-name`;
    
  12. Copy and paste the final SQL statement into the SQL Editor of SingleStore DB Studio and execute it. This will create a pipeline that connects SingleStore Managed Service to Kafka.

  13. In the SQL Editor, start your pipeline.

    START PIPELINE kafka_ssl;
    
  14. You may check the status of your pipeline in SingleStore DB Studio by clicking on Pipelines in the left sidebar.
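    You can also check pipeline status from the SQL Editor by querying the pipelines metadata table (a sketch, assuming the pipeline name kafka_ssl from the example above):

    SELECT PIPELINE_NAME, STATE
    FROM information_schema.PIPELINES
    WHERE PIPELINE_NAME = 'kafka_ssl';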


Connecting over SSL

To enable SSL encryption between Kafka and SingleStore DB, perform the following steps:

  1. Securely copy the CA certificate, SSL certificate, and SSL key used for connections between the cluster and Kafka brokers from the Kafka cluster to every SingleStore node. You should use a secure file transfer method, such as scp, to copy the files to your SingleStore nodes. The file locations on your SingleStore nodes should be consistent across the cluster.
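    For example, using scp (a sketch; the node host name and file names are placeholders, and the destination path matches the examples later in this topic):

    $ scp ca-cert.pem client_memsql_client.pem client_memsql_client.key admin@singlestore-node.example.com:/var/private/ssl/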

  2. In your CONFIG JSON, if you want to enable SSL encryption only, set "security.protocol": "ssl". If you want to enable Kerberos with SSL, set "security.protocol": "sasl_ssl" and set the Kerberos credentials after you’ve completed step 3.

  3. Set the remaining SSL configuration in the CONFIG JSON:

    • ssl.ca.location: Path to the CA certificate on the SingleStore node.
    • ssl.certificate.location: Path to the SSL certificate on the SingleStore node.
    • ssl.key.location: Path to the SSL certificate key on the SingleStore node.
  4. If your SSL certificate key is protected by a password, set it in your CREDENTIALS JSON.

    • ssl.key.password: Password for SSL certificate key.

Authenticating with Kerberos

To configure a Kafka pipeline to authenticate with Kerberos, you must configure all nodes in your cluster as clients for Kerberos authentication and then set the credentials in your CREATE PIPELINE statement. To do this, perform the following steps:

  1. Securely copy the keytab file containing the SingleStore DB service principal (e.g. memsql/host.domain.com@REALM.NAME) from the Kerberos server to every node in your cluster. You should use a secure file transfer method, such as scp, to copy the keytab file to your SingleStore nodes. The file location on your SingleStore nodes should be consistent across the cluster.

  2. Make sure your SingleStore nodes can connect to the KDC server using the fully-qualified domain name (FQDN) of the KDC server. This might require configuring network settings or updating /etc/hosts on your nodes.
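    For example, a hypothetical /etc/hosts entry mapping the KDC's FQDN to its IP address:

    # Hypothetical entry; substitute your KDC's IP address and FQDN
    10.0.0.10    host.example.com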

  3. Also ensure that the memsql service account on the node can access the copied keytab file. This can be accomplished by changing file ownership or permissions. If the memsql account cannot access the keytab file, you will not be able to complete the next step because your master aggregator will not be able to restart after applying configuration updates.
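    For example (a sketch, assuming the keytab was copied to /etc/krb5.keytab and the nodes run under the memsql user):

    $ sudo chown memsql:memsql /etc/krb5.keytab
    $ sudo chmod 600 /etc/krb5.keytab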

  4. When authenticating with Kerberos, SingleStore DB needs to authenticate as a client, which means you must also install a Kerberos client onto each node in your cluster. The following installs the krb5-user package for Debian-based Linux distributions.

    $ sudo apt-get update && sudo apt-get install krb5-user
    

    When setting up your Kerberos configuration, set your default realm, Kerberos admin server, and other options to those defined by your KDC server. In the examples used in this topic, the default realm is EXAMPLE.COM, and the Kerberos server settings point to the FQDN of the KDC server, host.example.com.
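    A minimal /etc/krb5.conf matching those example values might resemble the following (a sketch; use the realm and KDC host defined by your own Kerberos server):

    # Sketch matching this topic's examples; adjust to your KDC
    [libdefaults]
        default_realm = EXAMPLE.COM

    [realms]
        EXAMPLE.COM = {
            kdc = host.example.com
            admin_server = host.example.com
        }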

  5. In your CONFIG JSON, set "sasl.mechanism": "GSSAPI".

  6. Set "security.protocol": "sasl_ssl" for Kerberos and SSL connections, or "security.protocol": "sasl_plaintext" if you want to authenticate with Kerberos without SSL encryption. If you want to use Kerberos with SSL, make sure SSL is configured and enabled on the Kafka brokers and then add the SSL credential properties defined in the previous section.

  7. Set the remaining Kerberos configuration in CONFIG JSON:

    • sasl.kerberos.service.name: The Kerberos principal name that Kafka runs as. For example, "kafka".
    • sasl.kerberos.keytab: The local file path on the SingleStore node to the authenticating keytab.
    • sasl.kerberos.principal: The service principal name for the cluster. For example, "memsql/host.example.com@EXAMPLE.COM".

Authenticating with PLAIN or SCRAM SASL Mechanisms

To configure a Kafka pipeline to authenticate with the PLAIN or SCRAM SASL mechanisms, you must set the credentials in your CREATE PIPELINE statement. To do this, perform the following steps:

  1. In your CONFIG JSON, set "sasl.mechanism": "PLAIN". If your Kafka brokers use SCRAM for authentication, set "sasl.mechanism": "SCRAM-SHA-256" or "sasl.mechanism": "SCRAM-SHA-512" instead.

  2. In your CONFIG JSON, set "security.protocol": "sasl_ssl" for SASL connections over SSL, or "security.protocol": "sasl_plaintext" if you want to authenticate with Kafka without SSL encryption.

  3. In your CONFIG JSON, provide the username, "sasl.username": "<kafka_credential_username>".

  4. In your CREDENTIALS JSON, provide the password, "sasl.password": "<kafka_credential_password>".

SASL_PLAINTEXT/PLAIN Security

Please note that SASL_PLAINTEXT/PLAIN authentication mode with Kafka sends your credentials unencrypted over the network. It is therefore insecure and susceptible to being sniffed.

Also note that the SASL_PLAINTEXT/SCRAM authentication mode with Kafka protects the credentials sent over the network (the password itself is never transmitted in cleartext), but the Kafka messages themselves are still transmitted unencrypted.

Examples

The examples below make the following assumptions:

  • Port 9092 is a plaintext endpoint
  • Port 9093 is an SSL endpoint
  • Port 9094 is a plaintext SASL endpoint
  • Port 9095 is an SSL SASL endpoint

Plaintext

The following CREATE PIPELINE statements are equivalent:

CREATE PIPELINE `kafka_plaintext`
AS LOAD DATA KAFKA 'host.example.com:9092/test'
CONFIG '{"security.protocol": "plaintext"}'
INTO table t;

CREATE PIPELINE `kafka_no_creds`
AS LOAD DATA KAFKA 'host.example.com:9092/test'
INTO table t;

SSL

CREATE PIPELINE `kafka_ssl`
AS LOAD DATA KAFKA 'host.example.com:9093/test'
CONFIG '{"security.protocol": "ssl",
"ssl.certificate.location": "/var/private/ssl/client_memsql_client.pem",
"ssl.key.location": "/var/private/ssl/client_memsql_client.key",
"ssl.ca.location": "/var/private/ssl/ca-cert.pem"}'
CREDENTIALS '{"ssl.key.password": "abcdefgh"}'
INTO table t;

Kerberos with no SSL

CREATE PIPELINE `kafka_kerberos_no_ssl`
AS LOAD DATA KAFKA 'host.example.com:9094/test'
CONFIG '{""security.protocol": "sasl_plaintext",
"sasl.mechanism": "GSSAPI",
"sasl.kerberos.service.name": "kafka",
"sasl.kerberos.principal": "memsql/host.example.com@EXAMPLE.COM",
"sasl.kerberos.keytab": "/etc/krb5.keytab"}'
INTO table t;

Kerberos with SSL

CREATE PIPELINE `kafka_kerberos_ssl`
AS LOAD DATA KAFKA 'host.example.com:9095/test'
CONFIG '{"security.protocol": "sasl_ssl",
"sasl.mechanism": "GSSAPI",
"ssl.certificate.location": "/var/private/ssl/client_memsql_client.pem",
"ssl.key.location": "/var/private/ssl/client_memsql_client.key",
"ssl.ca.location": "/var/private/ssl/ca-cert.pem",
"sasl.kerberos.service.name": "kafka",
"sasl.kerberos.principal": "memsql/host.example.com@EXAMPLE.COM",
"sasl.kerberos.keytab": "/etc/krb5.keytab"}'
CREDENTIALS '{"ssl.key.password": "abcdefgh"}'
INTO table t;

SASL/PLAIN with SSL

CREATE PIPELINE `kafka_sasl_ssl_plain`
AS LOAD DATA KAFKA 'host.example.com:9095/test'
CONFIG '{"security.protocol": "sasl_ssl",
"sasl.mechanism": "PLAIN",
"ssl.certificate.location": "/var/private/ssl/client_memsql_client.pem",
"ssl.key.location": "/var/private/ssl/client_memsql_client.key",
"ssl.ca.location": "/var/private/ssl/ca-cert.pem",
"sasl.username": "kafka"}'
CREDENTIALS '{"ssl.key.password": "abcdefgh", "sasl.password": "metamorphosis"}'
INTO table t;

SASL/PLAIN without SSL

CREATE PIPELINE `kafka_sasl_plaintext_plain`
AS LOAD DATA KAFKA 'host.example.com:9094/test'
CONFIG '{"security.protocol": "sasl_plaintext",
"sasl.mechanism": "PLAIN",
"sasl.username": "kafka"}'
CREDENTIALS '{"sasl.password": "metamorphosis"}'
INTO table t;

SASL/SCRAM with SSL

CREATE PIPELINE `kafka_sasl_ssl_scram`
AS LOAD DATA KAFKA 'host.example.com:9095/test'
CONFIG '{"security.protocol": "sasl_ssl",
"sasl.mechanism": "SCRAM-SHA-512",
"ssl.certificate.location": "/var/private/ssl/client_memsql_client.pem",
"ssl.key.location": "/var/private/ssl/client_memsql_client.key",
"ssl.ca.location": "/var/private/ssl/ca-cert.pem",
"sasl.username": "kafka"}'
CREDENTIALS '{"ssl.key.password": "abcdefgh", "sasl.password": "metamorphosis"}'
INTO table t;

SASL/SCRAM without SSL

CREATE PIPELINE `kafka_sasl_plaintext_scram`
AS LOAD DATA KAFKA 'host.example.com:9094/test'
CONFIG '{"security.protocol": "sasl_plaintext",
"sasl.mechanism": "SCRAM-SHA-512",
"sasl.username": "kafka"}'
CREDENTIALS '{"sasl.password": "metamorphosis"}'
INTO table t;