ClickHouse and Kafka

After installing ClickHouse and Kafka, creating a topic called 'test', and sending IPFIX data to this topic, we are ready to store the data permanently in a database; here I chose ClickHouse.

How do we get the data from Kafka into ClickHouse? We follow the instructions from the Altinity Kafka engine tutorial: https://altinity.com/blog/2020/5/21/clickhouse-kafka-engine-tutorial

We need three tables:

  • A target MergeTree table to provide a home for ingested data
  • A Kafka engine table to make the topic look like a ClickHouse table
  • A materialized view to move data automatically from Kafka to the target table

Our Kafka JSON event looks like this:

"event_type":"purge",
"ip_src":"6.6.6.6",
"ip_dst":"1.1.1.1",
"port_src":56344,
"port_dst":443,
"tcp_flags":24,
"ip_proto":"tcp",
"tos":0,
"timestamp_start":"2021-01-19 3:55:10.026851",
"timestamp_end":"0000-00-00 0:00:00.000000",
"packets":1,
"bytes":667,
"writer_id":
"default_kafka/32086"

Following the tutorial, we first create the target MergeTree table. The tutorial uses a simple 'readings' example; for the IPFIX data the column list would be adapted to the fields of the event above (a sketch follows at the end of this section).

-- create table 1 of 3: the target MergeTree table
CREATE TABLE readings (
    readings_id Int32 Codec(DoubleDelta, LZ4),
    time DateTime Codec(DoubleDelta, LZ4),
    date ALIAS toDate(time),
    temperature Decimal(5,2) Codec(T64, LZ4)
) Engine = MergeTree
PARTITION BY toYYYYMM(time)
ORDER BY (readings_id, time);
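
The tutorial's remaining two tables follow the same pattern. The sketch below is only an outline along the lines of the tutorial: the broker address, topic name, consumer group, and message format are placeholders and have to match your Kafka setup.

-- create table 2 of 3: the Kafka engine table that makes the topic look like a ClickHouse table
CREATE TABLE readings_queue (
    readings_id Int32,
    time DateTime,
    temperature Decimal(5,2)
) Engine = Kafka
SETTINGS kafka_broker_list = 'localhost:9092',          -- placeholder, use your broker(s)
         kafka_topic_list = 'readings',                 -- topic carrying the readings messages
         kafka_group_name = 'readings_consumer_group1', -- placeholder consumer group
         kafka_format = 'JSONEachRow';                  -- or CSVWithNames etc., depending on the producer

-- create table 3 of 3: the materialized view that moves rows from the Kafka table into the target table
CREATE MATERIALIZED VIEW readings_queue_mv TO readings AS
SELECT readings_id, time, temperature
FROM readings_queue;

Once all three objects exist, rows arriving on the topic are inserted into the target table automatically and can be checked with a plain SELECT against it.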

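Applied to our IPFIX event on the 'test' topic, the same three tables might look roughly like the sketch below. Apart from the field names and the topic name taken from above, everything here is an assumption: the table names, column types, broker address, and consumer group are illustrative only, and the timestamps are kept as String to sidestep the non-standard sample values (the "0000-00-00 ..." end timestamp and the single-digit hour); in practice you would convert them to DateTime64, e.g. with a best-effort parse in the materialized view.

-- 1 of 3: target table for the IPFIX flows (table name and column types are assumptions)
CREATE TABLE flows (
    event_type LowCardinality(String),
    ip_src String,
    ip_dst String,
    port_src UInt16,
    port_dst UInt16,
    tcp_flags UInt8,
    ip_proto LowCardinality(String),
    tos UInt8,
    timestamp_start String,   -- kept as String in this sketch, see note above
    timestamp_end String,
    packets UInt64,
    bytes UInt64,
    writer_id String
) Engine = MergeTree
ORDER BY (ip_src, ip_dst, timestamp_start);

-- 2 of 3: Kafka engine table reading the 'test' topic, one JSON object per message
CREATE TABLE flows_queue (
    event_type String,
    ip_src String,
    ip_dst String,
    port_src UInt16,
    port_dst UInt16,
    tcp_flags UInt8,
    ip_proto String,
    tos UInt8,
    timestamp_start String,
    timestamp_end String,
    packets UInt64,
    bytes UInt64,
    writer_id String
) Engine = Kafka
SETTINGS kafka_broker_list = 'localhost:9092',  -- placeholder, use your broker(s)
         kafka_topic_list = 'test',
         kafka_group_name = 'clickhouse_flows', -- placeholder consumer group
         kafka_format = 'JSONEachRow';

-- 3 of 3: materialized view pushing rows from the queue table into the target table
CREATE MATERIALIZED VIEW flows_mv TO flows AS
SELECT * FROM flows_queue;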