Can't sink topic to Bigtable: "Unexpected error occurred with connector"

Hi, I’m new to Confluent Platform and Kafka streaming. I would like to sink a topic to Bigtable in GCP, but I can’t create the Bigtable connector successfully. With the same topic, I can create the BigQuery connector without any problem.

Bigtable connector error: "Unexpected error occurred with connector. Confluent connect team is looking at your failure. We will reach out if we need more details. Please check back here for an update."

BigTable Config
Topic Name: page_events_agg
Input Kafka record value format: JSON
Input Kafka record key format: JSON
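
For reference, this is roughly what I believe the equivalent self-managed connector config would look like. I'm on the fully managed connector, so the property names below (especially the GCP ones) are my assumption based on the self-managed Bigtable sink, not something I've verified:

# Assumed self-managed equivalent of the managed Bigtable sink config
connector.class=io.confluent.connect.gcp.bigtable.BigtableSinkConnector
topics=page_events_agg
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
# GCP settings (assumed property names)
gcp.bigtable.project.id=<my-project>
gcp.bigtable.instance.id=<my-instance>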

The query I used to create the topic:

CREATE OR REPLACE TABLE page_events_agg_table
WITH (KAFKA_TOPIC='page_events_agg_table', KEY_FORMAT='JSON', VALUE_FORMAT='JSON', PARTITIONS=1)
AS
SELECT
    SPLIT(DETAIL, '__')[2]           AS ROW_KEY,
    AS_VALUE(SPLIT(DETAIL, '__')[2]) AS PAGE,
    SUM(TOTAL_SPENDING / 1000 / 60)  AS TOTAL_TIME_SPENDING_M
FROM customer_page_spending_stream
WINDOW TUMBLING (SIZE 5 MINUTES)
GROUP BY SPLIT(DETAIL, '__')[2]
EMIT CHANGES;
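
In case it helps diagnose this, the key that actually lands on the topic can be inspected in ksqlDB (the tumbling window may affect what the key looks like on the wire):

-- Show the key/value schema ksqlDB has for the table
DESCRIBE page_events_agg_table;

-- Print raw records (key and value) from the backing topic
PRINT 'page_events_agg_table' FROM BEGINNING LIMIT 5;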

Additional info: if I remove the key format when creating the topic, the key format defaults to KAFKA. With that, I can create the Bigtable connector successfully, but the records land in the DLQ instead, with this error:

"org.apache.kafka.connect.errors.DataException: Converting byte[] to Kafka Connect data failed due to serialization error

Can anyone help or suggest how I can solve this? TT

Thank you so much for any help.