# Using lakeFS with Apache Kafka
Apache Kafka provides a unified, high-throughput, low-latency platform for handling real-time data feeds.
Different Kafka distributions offer different methods for exporting data to S3, known as Kafka Connect sink connectors.
The most commonly used connector for S3 is Confluent's S3 Sink Connector.
Add the following to your `connector.properties` file to write to lakeFS:
```properties
# Your lakeFS repository
s3.bucket.name=example-repo

# Your lakeFS S3 endpoint and credentials
store.url=https://lakefs.example.com
aws.access.key.id=AKIAIOSFODNN7EXAMPLE
aws.secret.access.key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# main being the branch we want to write to
topics.dir=main/topics
```
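When running Kafka Connect in distributed mode, the same settings are submitted as JSON to the worker's REST API rather than a properties file. A minimal sketch, assuming a Connect worker at `localhost:8083`; the connector name `lakefs-sink` is hypothetical, and the repository, branch, and credentials are the example values from the snippet above:

```python
import json
import urllib.request

# Connector configuration mirroring the properties file above.
# "example-repo" is the lakeFS repository and "main" the target branch;
# replace the endpoint and credentials with your own.
config = {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "s3.bucket.name": "example-repo",           # lakeFS repository
    "store.url": "https://lakefs.example.com",  # lakeFS S3 endpoint
    "aws.access.key.id": "AKIAIOSFODNN7EXAMPLE",
    "aws.secret.access.key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
    "topics.dir": "main/topics",                # write under the main branch
}

payload = json.dumps({"name": "lakefs-sink", "config": config}).encode()

def register_connector(connect_url="http://localhost:8083"):
    """POST the connector definition to the Kafka Connect REST API."""
    req = urllib.request.Request(
        f"{connect_url}/connectors",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

Because `s3.bucket.name` maps to the lakeFS repository and `topics.dir` is prefixed with the branch name, switching branches only requires changing `topics.dir`.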
## Example
For usage examples, see the lakeFS Kafka sample repo.