Name: confluent-kafka-dotnet
Owner: Confluent Inc.
Description: Confluent's Apache Kafka .NET client
Created: 2016-11-01 16:25:52.0
Updated: 2018-05-24 09:33:29.0
Pushed: 2018-05-23 13:45:33.0
Homepage: https://docs.confluent.io/current/clients/confluent-kafka-dotnet/api/Confluent.Kafka.html
Size: 1035
Language: C#
confluent-kafka-dotnet is Confluent's .NET client for Apache Kafka and the Confluent Platform.
Features:

- High performance: confluent-kafka-dotnet is a lightweight wrapper around librdkafka, a finely tuned C client.
- Reliability: there are a lot of details to get right when writing an Apache Kafka client. We get them right in one place (librdkafka) and leverage this work across all of our clients (also confluent-kafka-python and confluent-kafka-go).
- Supported: commercial support is offered by Confluent.
- Future proof: Confluent, founded by the creators of Kafka, is building a streaming platform with Apache Kafka at its core. It's a high priority for us that client features keep pace with core Apache Kafka and components of the Confluent Platform.
confluent-kafka-dotnet is derived from Andreas Heider's rdkafka-dotnet. We're fans of his work and were very happy to have been able to leverage rdkafka-dotnet as the basis of this client. Thanks Andreas!
confluent-kafka-dotnet is distributed via NuGet. We provide three packages: Confluent.Kafka (the core client library), Confluent.Kafka.Avro (Avro serializer and deserializer integrated with Confluent Schema Registry), and Confluent.SchemaRegistry (the Schema Registry client, a dependency of Confluent.Kafka.Avro).
To install Confluent.Kafka from within Visual Studio, search for Confluent.Kafka in the NuGet Package Manager UI, or run the following command in the Package Manager Console:
Install-Package Confluent.Kafka -Version 0.11.4
To add a reference to a dotnet core project, execute the following at the command line:
dotnet add package -v 0.11.4 Confluent.Kafka
We have started working towards a 1.0 release of the library, which will occur after we add idempotence and transaction features. In order to best accommodate these and other changes, we will be making breaking changes to the API in that release. You can track our progress on the 1.0-experimental branch (as well as corresponding packages on nuget.org). We have already added support for message headers and custom timestamps, amongst other things. Note that all work on this branch is subject to change and should not be considered production ready. All feedback is very welcome!
Also, nuget packages corresponding to all release branch commits are available from the following nuget package source (note: this is not a web URL - you should specify it as a package source in the nuget package manager): https://ci.appveyor.com/nuget/confluent-kafka-dotnet. The version suffix of these nuget packages matches the AppVeyor build number. You can see which commit a particular build number corresponds to by looking at the AppVeyor build history.
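For example, the feed can be registered in a nuget.config file alongside your solution (the key name appveyor-ci below is arbitrary):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- AppVeyor CI feed serving pre-release confluent-kafka-dotnet builds -->
    <add key="appveyor-ci" value="https://ci.appveyor.com/nuget/confluent-kafka-dotnet" />
  </packageSources>
</configuration>
```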
Take a look in the examples directory for example usage. The integration tests also serve as good examples.
For an overview of configuration properties, refer to the librdkafka documentation.
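Any librdkafka configuration property can be set directly in the dictionary passed to the client. A small illustrative fragment (the property names debug, socket.timeout.ms, and batch.num.messages are standard librdkafka settings; the values shown are examples, not recommendations):

```csharp
var config = new Dictionary<string, object>
{
    { "bootstrap.servers", "localhost:9092" },
    // Enable verbose librdkafka logging for selected subsystems.
    { "debug", "broker,topic,msg" },
    // Timeout for broker network requests (milliseconds).
    { "socket.timeout.ms", 60000 },
    // Maximum number of messages batched into one MessageSet.
    { "batch.num.messages", 10000 }
};
```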
API documentation is available on the Confluent website. Note that there is currently an issue with the build process that is preventing some of this documentation from being generated. For missing information, please refer instead to the XML doc comments in the source code.
Basic producer example:

```csharp
using System;
using System.Text;
using System.Collections.Generic;
using Confluent.Kafka;
using Confluent.Kafka.Serialization;

public class Program
{
    public static void Main()
    {
        var config = new Dictionary<string, object>
        {
            { "bootstrap.servers", "localhost:9092" }
        };

        // Null key type: no key serializer is required, so pass null for it.
        using (var producer = new Producer<Null, string>(config, null, new StringSerializer(Encoding.UTF8)))
        {
            // Blocks until the delivery report for the message is available.
            var dr = producer.ProduceAsync("my-topic", null, "test message text").Result;
            Console.WriteLine($"Delivered '{dr.Value}' to: {dr.TopicPartitionOffset}");
        }
    }
}
```
Basic consumer example:

```csharp
using System;
using System.Text;
using System.Collections.Generic;
using Confluent.Kafka;
using Confluent.Kafka.Serialization;

public class Program
{
    public static void Main()
    {
        var conf = new Dictionary<string, object>
        {
            { "group.id", "test-consumer-group" },
            { "bootstrap.servers", "localhost:9092" },
            { "auto.commit.interval.ms", 5000 },
            { "auto.offset.reset", "earliest" }
        };

        using (var consumer = new Consumer<Null, string>(conf, null, new StringDeserializer(Encoding.UTF8)))
        {
            // Raised for each message successfully consumed.
            consumer.OnMessage += (_, msg)
                => Console.WriteLine($"Read '{msg.Value}' from: {msg.TopicPartitionOffset}");

            // Raised on critical errors, e.g. connection failures.
            consumer.OnError += (_, error)
                => Console.WriteLine($"Error: {error}");

            // Raised on errors consuming a particular message, e.g. deserialization failures.
            consumer.OnConsumeError += (_, msg)
                => Console.WriteLine($"Consume error ({msg.TopicPartitionOffset}): {msg.Error}");

            consumer.Subscribe("my-topic");

            while (true)
            {
                // Raises the events above for any messages / errors received.
                consumer.Poll(TimeSpan.FromMilliseconds(100));
            }
        }
    }
}
```
The Avro serializer and deserializer provided by Confluent.Kafka.Avro can be used with the GenericRecord class, or with specific classes generated using the avrogen tool (available here). Usage:

dotnet /path/to/avrogen.dll -s your_schema.asvc .
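A minimal sketch of producing Avro data with GenericRecord, assuming a Schema Registry instance at localhost:8081; the topic name, key, and schema below are illustrative, and the AvroSerializer constructor usage shown is an assumption that may differ between releases:

```csharp
using System.Collections.Generic;
using Avro;
using Avro.Generic;
using Confluent.Kafka;
using Confluent.Kafka.Serialization;

public class AvroExample
{
    public static void Main()
    {
        var config = new Dictionary<string, object>
        {
            { "bootstrap.servers", "localhost:9092" },
            // The Avro serializer registers / fetches schemas here.
            { "schema.registry.url", "localhost:8081" }
        };

        using (var producer = new Producer<string, GenericRecord>(
            config, new AvroSerializer<string>(), new AvroSerializer<GenericRecord>()))
        {
            // Parse a record schema and build a record conforming to it.
            var schema = (RecordSchema)Schema.Parse(
                "{\"type\": \"record\", \"name\": \"User\", " +
                "\"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}");

            var record = new GenericRecord(schema);
            record.Add("name", "Alice");

            producer.ProduceAsync("users", "user-key", record).Wait();
        }
    }
}
```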
The Confluent Cloud example demonstrates how to configure the .NET client for use with Confluent Cloud.
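As a sketch, connecting to Confluent Cloud typically amounts to adding SASL_SSL settings to the configuration dictionary; the property names are standard librdkafka settings, while the endpoint, credentials, and CA path below are placeholders:

```csharp
var config = new Dictionary<string, object>
{
    { "bootstrap.servers", "<broker-endpoint>:9092" },
    { "security.protocol", "SASL_SSL" },
    { "sasl.mechanisms", "PLAIN" },
    { "sasl.username", "<api-key>" },
    { "sasl.password", "<api-secret>" },
    // Path to CA certificates used to verify the broker's certificate.
    { "ssl.ca.location", "/usr/local/etc/openssl/cert.pem" }
};
```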
To build the library or any test or example project, run the following from within the relevant project directory:
dotnet restore
dotnet build
To run an example project, run the following from within the example's project directory:
dotnet run <args>
From within the test/Confluent.Kafka.UnitTests directory, run:
dotnet test
From within the Confluent Platform (or Apache Kafka) distribution directory, run the following two commands (in separate terminal windows) to set up a single broker test Kafka cluster:
./bin/zookeeper-server-start ./etc/kafka/zookeeper.properties
./bin/kafka-server-start ./etc/kafka/server.properties
Now use the bootstrap-topics.sh script in the test/Confluent.Kafka.IntegrationTests directory to set up the prerequisite topics:

./bootstrap-topics.sh <confluent platform path> <zookeeper>
then:
dotnet test
Copyright (c) 2016-2017 Confluent Inc.; 2015-2016 Andreas Heider