Overview of the Kafka connector
Kafka Connect is a framework for connecting Kafka with external systems, including databases. A Kafka Connect cluster is a separate cluster from the Kafka cluster. The Kafka Connect cluster supports running and scaling out connectors…
Installing and configuring the Kafka connector
The Kafka connector is provided as a JAR (Java archive) file. Snowflake provides two versions of the connector: a version for the Confluent package of Kafka, and a version for the open source software (OSS) Apache Kafka package.
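After installing the JAR, the connector is configured with a set of properties. A minimal sketch of a distributed-mode configuration is shown below; the account URL, credentials, database, schema, and topic names are placeholder assumptions, while the property keys themselves are the connector's documented settings.

```json
{
  "name": "mykafkaconnector",
  "config": {
    "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "tasks.max": "1",
    "topics": "mytopic",
    "snowflake.url.name": "myaccount.snowflakecomputing.com:443",
    "snowflake.user.name": "kafka_connector_user",
    "snowflake.private.key": "<private key, unencrypted or with passphrase>",
    "snowflake.database.name": "mydb",
    "snowflake.schema.name": "myschema",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "com.snowflake.kafka.connector.records.SnowflakeJsonConverter"
  }
}
```

For the OSS Apache Kafka package in standalone mode, the same keys go into a `.properties` file instead of JSON.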
Troubleshooting the Kafka connector
The Kafka connector moves files it could not load to the stage associated with the target table. The syntax for referencing a table stage is @[namespace.]%table_name. List all files located in the table stage using the LIST command.
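For example, to inspect failed files for a hypothetical table MYTABLE in database mydb and schema myschema (names assumed for illustration):

```sql
-- List files the connector moved to the table stage after a failed load
LIST @mydb.myschema.%mytable;
```

The output includes each file's name, size, and last-modified timestamp, which helps correlate a failed file with the Kafka offsets it contained.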
Managing the Kafka connector
The connector creates one named internal stage for each Kafka topic. The format of the stage name is SNOWFLAKE_KAFKA_CONNECTOR_connector_name_STAGE_table_name. Note that each internal stage stores not only files to be loaded into tables,…
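Given that naming pattern, the stage for an assumed connector named mykafkaconnector writing to a table MYTABLE could be inspected like this (connector and table names are hypothetical):

```sql
-- List the connector's named internal stage for one topic/table pair
LIST @SNOWFLAKE_KAFKA_CONNECTOR_mykafkaconnector_STAGE_mytable;
```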
Set up the Openflow Connector for Kafka
The Openflow Connector for Kafka is available in three different configurations, each optimized for specific use cases.
Monitoring the Kafka connector using Java Management Extensions (JMX)
Kafka Connect provides pre-configured JMX metrics that provide information about the Kafka connector. The Snowflake Connector for Kafka provides multiple Managed Beans (MBeans) that you can use to ingest metrics about the Kafka…
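To read those MBeans, JMX must be exposed on the Connect worker. A minimal sketch, assuming the standard Kafka launch scripts (which honor the JMX_PORT environment variable) and an arbitrarily chosen port:

```shell
# Expose JMX on the Connect worker before starting it (port 9999 is an assumption)
export JMX_PORT=9999
bin/connect-distributed.sh config/connect-distributed.properties

# Then attach a JMX client such as jconsole to inspect the connector's MBeans
jconsole localhost:9999
```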
Schema detection and evolution for Kafka connector with Snowpipe Streaming classic
The Kafka connector with Snowpipe Streaming supports schema detection and evolution. The structure of tables in Snowflake can be defined and evolved automatically to support the structure of new Snowpipe streaming data loaded by the Kafka…
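Schema detection and evolution is opt-in. A sketch of the relevant connector properties, with values shown as an assumed example; the keys are the connector's documented settings for Snowpipe Streaming with schematization:

```properties
# Use Snowpipe Streaming as the ingestion method
snowflake.ingestion.method=SNOWPIPE_STREAMING
# Enable schema detection and evolution (schematization)
snowflake.enable.schematization=true
```

With schematization enabled, the connector creates and alters table columns to match the structure of incoming records instead of loading them into a single RECORD_CONTENT variant column.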
Troubleshooting & operating the Snowflake Kafka Connector distributed mode
The REST API is the interface to the Connect cluster. You can make requests to any Connect instance in the cluster, and the REST API automatically forwards requests to the appropriate worker if required.
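A few common requests against that REST API, sketched with curl; the worker's default port is 8083, and the host and connector name here are assumptions:

```shell
# List all connectors registered with the cluster
curl -s http://localhost:8083/connectors

# Check the status of one connector and its tasks
curl -s http://localhost:8083/connectors/mykafkaconnector/status

# Restart a failed task (task id 0)
curl -s -X POST http://localhost:8083/connectors/mykafkaconnector/tasks/0/restart
```

The status endpoint is usually the first stop when troubleshooting distributed mode, since it reports per-task state (RUNNING, FAILED) and the stack trace of a failed task.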
Kafka connector: where to find missing data and table stage FAQ
The Snowflake Kafka connector uses the table stage to store any files that were not loaded successfully. Criteria for data to be moved to the table stage For every file uploaded by the connector, the load history is frequently checked to…
Snowflake High Performance connector for Kafka: Install and configure
The Kafka connector is provided as a JAR (Java archive) file. Snowflake provides two versions of the connector: a version for the Confluent Kafka installation, and a version for the open source software (OSS) Apache Kafka…