Ingesting Kafka messages into Propel.
This guide covers how to set up the Kafka permissions Propel needs, allow Propel's IP addresses on your cluster, and create a Kafka Data Pool from either the Console or the API.
First, you’ll need to create a user with the necessary permissions for Propel to connect to your Kafka cluster.
Create the user
Kafka doesn’t manage users directly; it relies on the underlying authentication system. So if you’re using SASL/PLAIN for authentication, you would add the user to the JAAS configuration file.
Open your JAAS configuration file (for example, kafka_server_jaas.conf) in a text editor and add an entry for the “propel” user. Replace YOUR_SUPER_SECURE_PASSWORD with a secure password.
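The exact contents depend on your existing configuration, but assuming SASL/PLAIN, the broker's JAAS entry might look roughly like this (the admin user shown is a placeholder for your existing broker credentials):

```
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="admin-secret"
    user_admin="admin-secret"
    user_propel="YOUR_SUPER_SECURE_PASSWORD";
};
```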
Set environment variable
Set the KAFKA_OPTS environment variable to point to your JAAS config file:
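For example (the path below is a placeholder for wherever you saved the file):

```bash
export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"
```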
Restart the Kafka server for the changes to take effect.
Grant permissions
Now, you’ll use Kafka’s Access Control Lists (ACLs) to grant permissions to the “propel” user.
Use the kafka-acls CLI to add ACLs for the “propel” user so that it can operate on the propel-* consumer groups.
For each topic you need to ingest to Propel, run the following command:
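A sketch of what these kafka-acls invocations could look like, using YOUR_TOPIC as a placeholder topic name (adjust the script name and flags to your installation):

```bash
# Allow the "propel" user to describe and read the topic
kafka-acls --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:propel \
  --operation Describe --operation Read \
  --topic YOUR_TOPIC

# Allow the "propel" user to use consumer groups prefixed with "propel-"
kafka-acls --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:propel \
  --operation Describe --operation Read --operation Delete \
  --group propel- --resource-pattern-type prefixed
```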
Make sure to replace localhost:2181 with your Zookeeper server.
These commands grant Describe and Read access to the topics for the user “propel”.
Verify the ACLs
Verify that the ACLs have been correctly set by listing the ACLs for the topics you authorized.
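For example, to list the ACLs on a topic (again using the YOUR_TOPIC placeholder):

```bash
kafka-acls --authorizer-properties zookeeper.connect=localhost:2181 \
  --list --topic YOUR_TOPIC
```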
You should see the ACLs you added for the user “propel”.
These instructions set up an API key and secret with READ and DESCRIBE permissions in Confluent Cloud.
Create an API Key in the Confluent Cloud console
In Confluent Cloud, you generally use API keys for authentication rather than user/password combinations.
When creating the API key, grant it DESCRIBE and READ operations on the topic you need to ingest to Propel, and DESCRIBE, READ, and DELETE operations on the consumer group propel-*.
Verify the permissions
After setting the permissions, you can verify them by clicking on the API key in the “API Keys” section and reviewing the roles and resources it has access to.
These instructions set up an IAM user with READ and DESCRIBE permissions in AWS MSK.
Create the IAM user and policy
Sign in to the AWS IAM Management Console.
In the navigation pane, choose “Users” and then choose “Create User”.
For “Username”, enter “propel” and click “Next”.
Select “Attach policies directly”, search for and select the AmazonMSKReadOnlyAccess policy, and click “Next”.
Then, create a custom policy that grants describe, read, and delete permissions on MSK consumer groups prefixed with “propel-”:
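A sketch of what such a policy might look like, assuming IAM access control is enabled on the cluster; the kafka-cluster actions shown (including AlterGroup, which MSK requires for joining a consumer group) and the ARN pattern are assumptions to verify against your setup:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PropelConsumerGroups",
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:DescribeGroup",
        "kafka-cluster:AlterGroup",
        "kafka-cluster:DeleteGroup"
      ],
      "Resource": "arn:aws:kafka:<region>:<account-id>:group/<cluster-name>/*/propel-*"
    }
  ]
}
```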
Replace <region>, <account-id>, and <cluster-name> with your actual AWS region, account ID, and MSK cluster name.
Create the security credentials for the user
Enable authentication
Follow these steps to enable SASL/SCRAM for authentication with Redpanda.
Edit the Redpanda configuration file (usually located at /etc/redpanda/redpanda.yaml).
Enable SASL/SCRAM by adding or updating the following configuration:
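A sketch of the relevant redpanda.yaml settings (the admin superuser is a placeholder; merge these keys into your existing file rather than replacing it):

```yaml
redpanda:
  enable_sasl: true
  superusers:
    - admin
```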
See Redpanda docs for enabling SASL/SCRAM.
SASL provides authentication, but not encryption. To enable SASL authentication with TLS encryption for the Kafka API, in redpanda.yaml, enter:
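A sketch of what a TLS-enabled Kafka API listener might look like; the listener name, address, and certificate paths are placeholders:

```yaml
redpanda:
  kafka_api:
    - name: sasl_tls_listener
      address: 0.0.0.0
      port: 9092
      authentication_method: sasl
  kafka_api_tls:
    - name: sasl_tls_listener
      enabled: true
      cert_file: /etc/redpanda/certs/broker.crt
      key_file: /etc/redpanda/certs/broker.key
      truststore_file: /etc/redpanda/certs/ca.crt
```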
See Redpanda docs for enabling TLS encryption.
To check if SASL/SCRAM is enabled, run the following command:
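One way to check is to query the cluster's SASL mechanisms with rpk (a sketch; this assumes rpk can reach the cluster's admin API):

```bash
rpk cluster config get sasl_mechanisms
```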
You should see SCRAM in the output.
See Redpanda docs for checking SASL/SCRAM.
Restart your Redpanda server to apply the changes.
Create the user
Redpanda uses rpk, a command-line tool, to manage users and ACLs.
Create the user “propel”:
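A sketch of the rpk command (the exact flags vary by rpk version; check rpk acl user create --help):

```bash
rpk acl user create propel -p <YOUR_SUPER_SECURE_PASSWORD>
```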
Replace <YOUR_SUPER_SECURE_PASSWORD> with a secure password.
Grant permissions
Grant DESCRIBE, READ, and DELETE permissions to the “propel” user for the topics you need to ingest.
Use the rpk acl command to add ACLs for the “propel” user so that it can operate on the propel-* consumer groups.
For each topic you need to ingest to Propel, run the following command:
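A sketch of the rpk acl commands, with YOUR_TOPIC as a placeholder topic name (you may also need authentication flags for your admin user):

```bash
# Allow "propel" to describe and read the topic
rpk acl create --allow-principal User:propel \
  --operation describe --operation read \
  --topic YOUR_TOPIC

# Allow "propel" to use consumer groups prefixed with "propel-"
rpk acl create --allow-principal User:propel \
  --operation describe --operation read --operation delete \
  --group propel- --resource-pattern-type prefixed
```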
These commands grant DESCRIBE and READ access to the topic “YOUR_TOPIC” for the user “propel”.
Verify the ACLs
Verify that the ACLs have been correctly set by listing the ACLs for the topic “YOUR_TOPIC”:
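A sketch of the listing command:

```bash
rpk acl list --topic YOUR_TOPIC
```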
You should see the ACLs you added for the user “propel”.
To ensure that Propel can connect to your Kafka cluster, you need to authorize access from the following IP addresses:
Create a Kafka Data Pool
Go to the “Data Pools” section in the Console, click “Create Data Pool” and click on the “Kafka” tile.
If you create a Kafka Data Pool for the first time, you must create your Kafka credentials for Propel to connect to your Kafka servers.
Create your Kafka credentials
To create your Kafka credentials, you will need your bootstrap server URLs and the authentication details (such as the username and password, or API key and secret, you set up earlier).
Test your credentials
After entering your Kafka credentials, click “Create and test credentials” to ensure Propel can successfully connect to your Kafka cluster. If the connection is successful, you will see a confirmation message. If not, check your entered credentials and try again.
Introspect your Kafka topics
Here, you will see a list of topics available to ingest. If you don’t see the topic you want to ingest, make sure your user has the right permissions to access the topic.
Select the topic to ingest and timestamp
Here, you will see a list of topics available to ingest. Select the topic you want to ingest into this Data Pool. You will see the schema of the Data Pool.
Next, you need to select the timestamp column. This is the column that will be used to order the data in the Data Pool. By default, Propel selects the _timestamp generated by Kafka.
Name your Data Pool and start ingesting
After you’ve selected the topic, provide a name for your Data Pool. This name will be used to identify the Data Pool in Propel. Once you’ve named your Data Pool, click “Create Data Pool”. Propel will then start ingesting data from the selected Kafka topics into your Data Pool.
Look at the data in your Data Pool
Once you’ve started ingesting data, you can view the data in your Data Pool. Go to the “Data Pools” section in the Console, click on your Kafka Data Pool, and click on the “Preview Data” tab. Here, you can see the data that has been ingested from your Kafka topic.
First, you need to create a Data Source with your Kafka credentials.
To create the Data Pool, you need to:
Use the id of the Data Source to create the Data Pool, replacing the <DATA_SOURCE_ID> in the example below, and specify the topic you want to ingest in the table field.