Apache Kafka® is the de facto standard in the data streaming world for sending
messages from multiple producers to multiple consumers in a fast, reliable,
and scalable manner.
Come and learn the basic concepts and how to use it by modelling a fish and
chip shop!
Handling large numbers of events is an increasing challenge in our
cloud-centric world. For instance, in the IoT (Internet of Things) industry,
devices are all busy announcing their current state, which we want to manage
and report on, while at the same time we want to send firmware and other
updates *back* to specific groups of devices.
Traditional messaging solutions don't scale well for this type of problem. We
want to guarantee not to lose events, to handle high volumes in a timely
manner, and to be able to distribute message reception or production across
multiple consumers or producers (much as sharding distributes database reads).
As it turns out, there is a good solution available: Apache Kafka®, which
provides all the capabilities we are looking for.
In this talk, rather than considering some imaginary IoT scenario, I'm going
to look at how one might use Kafka to model the events required to run a fish
and chip shop: ordering (plaice and chips for me, please), food preparation,
accounting, and so on.
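For instance, the till might publish each order as an event. The following is
a minimal sketch of such a producer using the kafka-python library; the broker
address, the "orders" topic name, and the message format are purely
illustrative assumptions, not part of the talk itself:

```python
import json

from kafka import KafkaProducer

# Connect to a (hypothetical) local Kafka broker.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    # Serialise each order as JSON bytes before sending.
    value_serializer=lambda value: json.dumps(value).encode("utf-8"),
)

# The till publishes one event per customer order.
producer.send("orders", {"order": ["plaice", "chips"]})
producer.flush()  # make sure the event is actually sent before moving on
```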
I'll demonstrate the handling of multiple producers and consumers; the
automatic routing of events as new consumers are added; persistence, which
allows a new consumer to start consuming events from the past; and more.
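As a taste of what that looks like, here is a matching consumer sketch, again
assuming kafka-python and the illustrative "orders" topic from above.
Consumers that share a group_id divide a topic's events between them, and
auto_offset_reset="earliest" lets a brand-new consumer group start from the
oldest events Kafka still has stored:

```python
import json

from kafka import KafkaConsumer

# Join the (hypothetical) "food-preparers" consumer group. Kafka shares the
# topic's partitions out across all consumers in the same group.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="food-preparers",
    # With no previously committed offset, start from the earliest stored
    # event rather than waiting only for new ones.
    auto_offset_reset="earliest",
    value_deserializer=lambda data: json.loads(data.decode("utf-8")),
)

# Block forever, handling each order event as it arrives.
for message in consumer:
    print(f"Preparing order: {message.value['order']}")
```

Running a second copy of this consumer with the same group_id would split the
orders between the two of them; giving it a different group_id would instead
deliver every order to both.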