Kafka Connect PostgreSQL Sink

Hello folks.

I am working on setting up a PostgreSQL sink with a JSON schema. So far everything has gone quite well, but I have one requirement that I haven’t been able to implement successfully. A couple of fields are defined as strings in the schema, and they are passed into the sink as strings, but the corresponding columns are defined as enums in PostgreSQL.
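For context, a JDBC sink configuration for this kind of setup typically looks something like the following. This is an illustrative sketch, not the poster's actual config; the topic name and connection URL are placeholders:

```properties
# Illustrative JDBC sink connector config; topic, host, and credentials
# are placeholders, not values from the original post.
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=kafka_event
connection.url=jdbc:postgresql://localhost:5432/mydb
connection.user=postgres
connection.password=postgres
insert.mode=insert
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
```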

I have tackled this issue with the creation of a PostgreSQL trigger such as:

create or replace function "public"."cast_events" () RETURNS trigger AS
$$
BEGIN
  NEW.event_type := NEW.event_type::event_types;
  RETURN NEW;
END
$$
LANGUAGE plpgsql;

CREATE OR REPLACE TRIGGER "call_cast_events"
BEFORE INSERT OR UPDATE ON kafka_event FOR EACH ROW
EXECUTE PROCEDURE "public".cast_events();
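For reference, a minimal setup that this trigger assumes might look like the following; the enum values and table columns are hypothetical, since the post doesn't show them:

```sql
-- Hypothetical definitions; the actual enum values and table layout
-- were not shown in the post.
CREATE TYPE event_types AS ENUM ('created', 'updated', 'deleted');

CREATE TABLE kafka_event (
    id         serial PRIMARY KEY,
    event_type event_types
);

-- Through psql this works: the string literal is resolved to the
-- enum type during parsing, before the trigger ever fires.
INSERT INTO kafka_event (event_type) VALUES ('created');
```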

When I insert a row through psql, there are no issues.
But when the row comes in through Kafka Connect’s JDBC sink connector, the trigger doesn’t seem to execute at all.

Is there anything that would prevent the trigger from executing when the insert comes through Kafka Connect?

Any ideas or suggestions?

Hi @amontero

welcome :slight_smile:
One question to better understand your starting point:

you’re trying to insert rows into Postgres with Kafka Connect and the JDBC sink connector, correct?

best,
michael

Correct, I actually found a solution to this problem.

I added CREATE CAST (varchar AS My_Enum_Type).

No trigger was necessary. :grinning:
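For anyone landing here later, the full statement is likely of this shape. The enum type name is taken from the post; the `WITH INOUT AS IMPLICIT` clause is the standard way to let PostgreSQL apply the cast automatically when the JDBC driver binds the parameter as varchar:

```sql
-- WITH INOUT reuses the enum type's text I/O functions, and
-- AS IMPLICIT makes the cast apply automatically in assignments
-- and parameter binding, so no trigger is needed.
CREATE CAST (varchar AS My_Enum_Type) WITH INOUT AS IMPLICIT;
```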


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.