
apache kafka - Debezium ad hoc snapshot not working because of no maximum key - Stack Overflow


I'm having a problem running an ad hoc snapshot with the Debezium Postgres source connector. I already created the signaling table and added it to my Debezium config.

When I insert a new row to trigger the ad hoc snapshot, I get this in the log:

2025-03-12 10:48:42,698 INFO [postgres-source-connector4|task-0] Requested 'INCREMENTAL' snapshot of data collections '[public.myTable]' with additional conditions '[]' and surrogate key 'mytable_id' (io.debezium.pipeline.signal.actions.snapshotting.ExecuteSnapshot) [debezium-postgresconnector-mytable-change-event-source-coordinator]

2025-03-12 10:48:42,720 INFO [postgres-source-connector4|task-0] No maximum key returned by the query, incremental snapshotting of table 'public.mytable' finished as it is empty (io.debezium.pipeline.source.snapshot.incremental.AbstractIncrementalSnapshotChangeEventSource) [debezium-postgresconnector-mytable-change-event-source-coordinator]

But my table is not empty at all. I tried regular CDC and it works.

My table has a composite primary key, so I tried providing a surrogate key in the payload when inserting the row into the signal table. I also tried this on a new table, but it still didn't work.

Here is my insert query (the JSON payload was originally missing its closing brace, which I've fixed):

INSERT INTO public.debezium_signal VALUES ('test2', 'execute-snapshot', '{"data-collections": ["public.testtable"]}')
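For a table with a composite primary key, Debezium's signal payload also accepts a `surrogate-key` field naming a single column to chunk on. A sketch of such an insert, assuming the signal table has the standard `id`, `type`, `data` columns and that `testtable_id` is a suitable unique column (both are assumptions here):

```sql
-- Sketch: request an incremental snapshot of public.testtable,
-- chunking on the single column testtable_id instead of the composite PK.
INSERT INTO public.debezium_signal (id, type, data)
VALUES (
  'adhoc-1',            -- arbitrary unique signal id
  'execute-snapshot',
  '{"data-collections": ["public.testtable"], "type": "incremental", "surrogate-key": "testtable_id"}'
);
```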

EDIT: it seems to work for a new table with only a few rows. In fact, both the initial snapshot and the ad hoc snapshot fail for my older table with 1k+ rows; I suspect the rows are too many or too large.
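If row size is the suspect, Debezium exposes an `incremental.snapshot.chunk.size` connector property (default 1024 rows per chunk) that can be lowered so each chunk query returns less data. A hedged fragment of the connector config (the value 256 is just an illustrative choice):

```json
{
  "incremental.snapshot.chunk.size": "256"
}
```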
