Prerequisites
To follow the steps on this page:
- Create a target with the Real-time analytics capability enabled. You need your connection details.
This feature is currently not supported on Microsoft Azure.
Access your Kafka cluster in Confluent Cloud
Take the following steps to prepare your Kafka cluster for connection:
Create a service account
If you already have a service account, you can reuse it. To create a new service account:
- Log in to Confluent Cloud.
- Click the burger menu at the top-right of the pane, then press Access control > Service accounts > Add service account.
- Enter the following details:
  - Name: tigerdata-access
  - Description: Service account for the Tiger Cloud source connector
- Add the service account owner role, then click Next.
- Select a role assignment, then click Add.
- Click Next, then click Create service account.
Create API keys
- In Confluent Cloud, click Home > Environments, then select your environment and your cluster.
- Under Cluster overview in the left sidebar, select API Keys.
- Click Add key, choose Service Account, and click Next.
- Select tigerdata-access, then click Next.
- For your cluster, choose the Operation and select the following Permissions, then click Next:
  - Resource type: Cluster
  - Operation: DESCRIBE
  - Permission: ALLOW
- Click Download and continue, then securely store the ACL.
- Use the same procedure to add the following keys:
  - ACL 2: Topic access
    - Resource type: Topic
    - Topic name: Select the topics that Tiger Cloud should read
    - Pattern type: LITERAL
    - Operation: READ
    - Permission: ALLOW
  - ACL 3: Consumer group access
    - Resource type: Consumer group
    - Consumer group ID: tigerdata-kafka/<tiger_cloud_project_id>. See Find your connection details for where to find your ID.
    - Pattern type: PREFIXED
    - Operation: READ
    - Permission: ALLOW

You need these keys to configure your Kafka source connector.
Configure Confluent Cloud Schema Registry
The connector requires access to the Schema Registry to fetch schemas for Kafka topics. To configure the Schema Registry:
- Navigate to Schema Registry

  In Confluent Cloud, click Environments and select your environment, then click Stream Governance.

- Create a Schema Registry API key

  - Click API Keys, then click Add API Key.
  - Choose Service Account, select tigerdata-access, then click Next.
  - Under Resource scope, choose Schema Registry, select the default environment, then click Next.
  - In Create API Key, add the following, then click Create API Key:
    - Name: tigerdata-schema-registry-access
    - Description: API key for Tiger Cloud schema registry access
  - Click Download API Key and securely store the API key and secret, then click Complete.

- Assign roles for Schema Registry

  - Click the burger menu at the top-right of the pane, then press Access control > Accounts & access > Service accounts.
  - Select the tigerdata-access service account.
  - In the Access tab, add the following role assignments for schema subjects:
    - ResourceOwner on the service account.
    - DeveloperRead on schema subjects. Choose All schema subjects or restrict to specific subjects as required.
  - Save the role assignments.
Add Kafka source connector
Take the following steps to create a Kafka source connector:
- In Tiger Cloud Console, select your service.
- Go to Connectors > Source connectors. Click New Connector, then select Kafka.
- Click the pencil icon, then set the connector name.
- Set up Kafka authentication

  Enter the name of your cluster in Confluent Cloud and the information from the first api-key-*.txt file that you downloaded, then click Authenticate.

- Set up the Schema Registry

  Enter the service account ID and the information from the second api-key-*.txt file that you downloaded, then click Authenticate.

- Select topics to sync

  Add the schema and table, map the columns in the table, and click Create connector.
Known limitations and unsupported types
The following Avro schema types are not supported:

Union types

Multi-type non-nullable unions are blocked. Examples:
- Multiple type union:
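For illustration, a field whose type is a union of two or more non-null types has the blocked shape (the record and field names here are our own, not from the connector):

```json
{
  "type": "record",
  "name": "Measurement",
  "fields": [
    { "name": "value", "type": ["int", "string"] }
  ]
}
```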
- Union as root schema:
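A union used as the root schema itself is likewise blocked. An illustrative example (not taken from the connector docs):

```json
["int", "string"]
```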
Reference types (named type references)
Referencing a previously defined named type by name, instead of inline, is not supported. Examples:
- Named type definition:
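For illustration, a named record defined inline once (all names here are our own):

```json
{
  "type": "record",
  "name": "Address",
  "fields": [
    { "name": "street", "type": "string" },
    { "name": "city", "type": "string" }
  ]
}
```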
- Failing reference:
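A schema that refers to a previously defined named type (such as a record named Address) by name, rather than redefining it inline, fails. An illustrative example with our own names:

```json
{
  "type": "record",
  "name": "Customer",
  "fields": [
    { "name": "home", "type": "Address" }
  ]
}
```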
Unsupported logical types
Only the logical types on the connector's hardcoded supported list are handled. The supported list includes:
- decimal, date, time-millis, time-micros
- timestamp-millis, timestamp-micros, timestamp-nanos
- local-timestamp-millis, local-timestamp-micros, local-timestamp-nanos
- uuid, duration
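For illustration, a schema that uses only logical types from the supported list (the record and field names are our own):

```json
{
  "type": "record",
  "name": "Payment",
  "fields": [
    { "name": "id", "type": { "type": "string", "logicalType": "uuid" } },
    { "name": "amount", "type": { "type": "bytes", "logicalType": "decimal", "precision": 10, "scale": 2 } },
    { "name": "paid_at", "type": { "type": "long", "logicalType": "timestamp-micros" } }
  ]
}
```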