I'm deploying Telegraf to Kubernetes. The goal is to consume messages from a Kafka topic and send them to Zabbix as persistent storage. The Telegraf kafka_consumer input plugin is working fine and messages are being consumed, but they are not appearing in Zabbix. Here is my configuration:
apiVersion: v1
kind: ConfigMap
metadata:
  name: telegraf-config
  namespace: zabbix-dev-ns
data:
  telegraf.conf: |
    [agent]
      debug = true
      quiet = false
      metric_buffer_limit = 1000000
      hostname = "telegraf-kafka-host"

    [[inputs.kafka_consumer]]
      brokers = ["kafka:9093"]
      topics = ["zabbix-auditlog-dev2"]
      tls_ca = "/opt/sbd_root_ca.crt"
      tls_cert = "/opt/auditlog-zabbix.crt"
      tls_key = "/opt/auditlog-zabbix.key"
      consumer_group = "zabbix-auditlog-dev2"
      offset = "oldest"
      data_format = "json"
      json_string_fields = ["zabbix_instance", "auditid", "userid", "action",
        "resourcetype", "details", "ip", "resourceid", "resourcename"]

    [[outputs.file]]
      files = ["stdout"]
      data_format = "json"

    [[outputs.prometheus_client]]
      listen = ":9273"
      path = "/metrics"

    [[outputs.zabbix]]
      address = "zabbix-server-dev2-k-a.zabbix-dev-ns.svc.cluster.local:10051"
      key_prefix = "telegraf."
      lld_send_interval = "30s"
      lld_clear_interval = "1h"
      skip_measurement_prefix = true
      agent_active = false
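For context on what item keys I expect the Zabbix output to produce: my understanding (an assumption from the outputs.zabbix plugin docs, not verified against its source) is that each metric field becomes one trapper value whose key is key_prefix + measurement + "." + field, and that skip_measurement_prefix = true drops the measurement part. A minimal sketch of that derivation with my settings:

```python
# Sketch of how I believe outputs.zabbix derives item keys; derive_key is my
# own illustrative helper, not a function from the plugin.
def derive_key(field, measurement="kafka_consumer",
               key_prefix="telegraf.", skip_measurement_prefix=True):
    """Return the Zabbix item key Telegraf would (presumably) send for a field."""
    if skip_measurement_prefix:
        return f"{key_prefix}{field}"
    return f"{key_prefix}{measurement}.{field}"

# With skip_measurement_prefix = true, a field named "auditid" would map to:
print(derive_key("auditid"))  # telegraf.auditid
# Without it, the measurement name would be included:
print(derive_key("auditid", skip_measurement_prefix=False))  # telegraf.kafka_consumer.auditid
```

If this reading is right, each JSON field would need its own trapper item (e.g. telegraf.auditid), rather than a single catch-all key.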
The logs look like this, and I created a host in Zabbix with an item whose key is telegraf.fields, but nothing is showing up there.
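To rule out the Zabbix item configuration itself, I can push a test value straight to the trapper port. A minimal sketch of the Zabbix sender protocol frame ("ZBXD" + flags byte 0x01, an 8-byte little-endian payload length, then the JSON body); the host and key are the ones from my setup, and this only builds the packet, nothing is sent:

```python
import json
import struct

def build_sender_packet(host, key, value):
    """Build a Zabbix sender-protocol frame: b'ZBXD\\x01' header,
    8-byte little-endian payload length, then the JSON body."""
    body = json.dumps({
        "request": "sender data",
        "data": [{"host": host, "key": key, "value": value}],
    }).encode()
    return b"ZBXD\x01" + struct.pack("<Q", len(body)) + body

# Values from my setup; to actually send, this would be written to a TCP
# connection to the server on port 10051.
packet = build_sender_packet("telegraf-kafka-host", "telegraf.fields", "test")
print(packet[:5])  # b'ZBXD\x01'
```

In practice the same check is simpler with the stock zabbix_sender CLI pointed at the service address on port 10051; if a hand-sent value shows up under the item but Telegraf's values do not, the problem is on the Telegraf side (key or hostname mismatch) rather than the item.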