The context that this consumer requires to execute
A batch consumer function to be passed to KafkaConsumerService
Middleware for EachBatchConsumer
The decoded type of the kafka message value
The context that this consumer requires to execute
A consumer function to be passed to KafkaConsumerService
Middleware for EachMessageConsumer
The type of the kafka message value, before it is encoded
A producer function that has the topic and schema baked in, needing only the contents of the messages and a KafkaProducerService object.
A function to be called with the schema registry that will pre-register all schemas you'll be producing.
Consume a batch of messages in chunks.
For each chunk of the given size, it will call the consumer function, then fire the heartbeat and commit the offsets.
You can use it like a middleware:
const inBatches = chunkBatchMiddleware({ size: 500 });
new KafkaConsumer(kafka, schemaRegistry, {
topic: '...',
  eachBatch: inBatches(myBatchConsumer),
});
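The chunking itself can be sketched with a plain helper that splits a batch into consecutive slices; this is an illustrative stand-in, not the library's actual implementation:

```typescript
// Split an array of messages into consecutive chunks of at most `size` items.
// Illustrative helper: the middleware applies the consumer to each chunk in
// turn, heartbeating and committing offsets between chunks.
function chunk<T>(items: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}
```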
Convert a LoggerLike logger into a kafkajs logger.
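A sketch of such a conversion. The `logCreator` shape (a function returning a log handler that receives `level` and a `log` entry) follows kafkajs's custom log creator API; the `LoggerLike` interface and function name here are illustrative stand-ins for the library's own types:

```typescript
// Minimal stand-in for the library's LoggerLike interface (illustrative).
interface LoggerLike {
  error(message: string, meta?: object): void;
  warn(message: string, meta?: object): void;
  info(message: string, meta?: object): void;
  debug(message: string, meta?: object): void;
}

// kafkajs numeric log levels: ERROR = 1, WARN = 2, INFO = 4, DEBUG = 5.
// A kafkajs logCreator returns a function that is called for every log entry.
const toKafkajsLogCreator =
  (logger: LoggerLike) =>
  () =>
  ({ level, log }: { level: number; log: { message: string } & Record<string, unknown> }) => {
    const { message, ...meta } = log;
    if (level === 1) logger.error(message, meta);
    else if (level === 2) logger.warn(message, meta);
    else if (level === 4) logger.info(message, meta);
    else logger.debug(message, meta);
  };
```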
A helper to create a producer function, with "baked in" topic and schema names
const mySend = produce({ topic: 'my-topic', schema: mySchema });
await mySend(producer, [{ value: { ... }, partition: 1 }]);
The type of the kafka message value, before it is encoded
A generic middleware to pass on the KafkaProducerService instance inside the function call.
Pre-register schemas in the schema registry. This will add or load them from the registry and cache them, so they can be used for producing records.
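A sketch of what pre-registration might look like, assuming a registry client with a `register` method similar to @kafkajs/confluent-schema-registry. The `RegistryLike` interface, `preRegisterSchemas` name, and cache map are illustrative, not the library's actual API:

```typescript
// Illustrative subset of a schema registry client.
interface RegistryLike {
  register(schema: { type: string; schema: string }): Promise<{ id: number }>;
}

// Register each schema once up front and cache the resulting ids,
// so producing records later needs no round trip to the registry.
async function preRegisterSchemas(
  registry: RegistryLike,
  schemas: Record<string, string>,
): Promise<Map<string, number>> {
  const cache = new Map<string, number>();
  for (const [name, schema] of Object.entries(schemas)) {
    const { id } = await registry.register({ type: 'AVRO', schema });
    cache.set(name, id);
  }
  return cache;
}
```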
Encode the record value, using the schemaRegistry and a given registry schema id.
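For context, registry-aware encoders typically produce the Confluent wire format: a zero magic byte, a 4-byte big-endian schema id, then the serialized payload. The sketch below shows that framing with the Avro serialization step stubbed out as a pre-built buffer; the function name is illustrative:

```typescript
// Confluent wire format: magic byte 0x00, 4-byte big-endian schema id,
// then the Avro-encoded payload (passed in here already serialized).
function encodeWithSchemaId(schemaId: number, avroPayload: Buffer): Buffer {
  const header = Buffer.alloc(5);
  header.writeUInt8(0, 0);
  header.writeUInt32BE(schemaId, 1);
  return Buffer.concat([header, avroPayload]);
}
```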
Generated using TypeDoc
The decoded type of the kafka message value