Unable to sync data from Snowflake to StarRocks using the Airbyte StarRocks connector

I was trying the Airbyte StarRocks connector and ran into an error.
The connector is able to create tables in StarRocks (as seen in the screenshot), but no data is loaded into the table.
Attaching the logs below.

2024-04-12 06:07:40 platform > Docker volume job log path: /tmp/workspace/9/0/logs.log
2024-04-12 06:07:40 platform > Executing worker wrapper. Airbyte version: 0.57.2
2024-04-12 06:07:40 platform > Attempt 0 to save workflow id for cancellation
2024-04-12 06:07:40 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-04-12 06:07:40 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-04-12 06:07:40 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-04-12 06:07:40 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-04-12 06:07:40 platform > 
2024-04-12 06:07:40 platform > ----- START CHECK -----
2024-04-12 06:07:40 platform > 
2024-04-12 06:07:40 platform > Checking if airbyte/source-snowflake:0.3.1 exists...
2024-04-12 06:07:40 platform > airbyte/source-snowflake:0.3.1 was found locally.
2024-04-12 06:07:40 platform > Creating docker container = source-snowflake-check-9-0-vlefx with resources io.airbyte.config.ResourceRequirements@4bdf6c19[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@167b4993[hosts=[am62076.ap-southeast-2.snowflakecomputing.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-04-12 06:07:40 platform > Preparing command: docker run --rm --init -i -w /data/9/0 --log-driver none --name source-snowflake-check-9-0-vlefx --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-snowflake:0.3.1 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.57.2 -e WORKER_JOB_ID=9 airbyte/source-snowflake:0.3.1 check --config source_config.json
2024-04-12 06:07:40 platform > Reading messages from protocol version 0.2.0
2024-04-12 06:07:41 platform > 2024-04-12 06:07:41 INFO i.a.c.i.b.a.AdaptiveSourceRunner$Runner(getSource):74 - Running source under deployment mode: OSS
2024-04-12 06:07:41 platform > 2024-04-12 06:07:41 INFO i.a.c.i.b.a.AdaptiveSourceRunner$Runner(run):86 - Starting source: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-04-12 06:07:41 platform > 2024-04-12 06:07:41 INFO i.a.c.i.b.IntegrationCliParser(parseOptions):126 - integration args: {check=null, config=source_config.json}
2024-04-12 06:07:41 platform > 2024-04-12 06:07:41 INFO i.a.c.i.b.IntegrationRunner(runInternal):132 - Running integration: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-04-12 06:07:41 platform > 2024-04-12 06:07:41 INFO i.a.c.i.b.IntegrationRunner(runInternal):133 - Command: CHECK
2024-04-12 06:07:41 platform > 2024-04-12 06:07:41 INFO i.a.c.i.b.IntegrationRunner(runInternal):134 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2024-04-12 06:07:42 platform > 2024-04-12 06:07:42 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-04-12 06:07:42 platform > 2024-04-12 06:07:42 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-04-12 06:07:42 platform > 2024-04-12 06:07:42 INFO i.a.i.s.s.SnowflakeSource(buildUsernamePasswordConfig):121 - jdbc:snowflake://<>/?role=<>&warehouse=<>&database=<>&JDBC_QUERY_RESULT_FORMAT=JSON&CLIENT_SESSION_KEEP_ALIVE=true&application=airbyte_oss&schema=AIRCALL&CLIENT_METADATA_REQUEST_USE_CONNECTION_CTX=true
2024-04-12 06:07:42 platform > 2024-04-12 06:07:42 INFO i.a.i.s.s.SnowflakeDataSourceUtils(createDataSource):77 - Authorization mode is 'Username and password'
2024-04-12 06:07:42 platform > 2024-04-12 06:07:42 INFO c.z.h.HikariDataSource(getConnection):109 - HikariPool-1 - Starting...
2024-04-12 06:07:44 platform > 2024-04-12 06:07:44 INFO c.z.h.p.HikariPool(checkFailFast):554 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@3e67f5f2
2024-04-12 06:07:44 platform > 2024-04-12 06:07:44 INFO c.z.h.HikariDataSource(getConnection):122 - HikariPool-1 - Start completed.
2024-04-12 06:07:44 platform > 2024-04-12 06:07:44 INFO i.a.c.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):140 - Attempting to get metadata from the database to see if we can connect.
2024-04-12 06:07:45 platform > 2024-04-12 06:07:45 INFO c.z.h.HikariDataSource(close):349 - HikariPool-1 - Shutdown initiated...
2024-04-12 06:07:45 platform > 2024-04-12 06:07:45 INFO c.z.h.HikariDataSource(close):351 - HikariPool-1 - Shutdown completed.
2024-04-12 06:07:45 platform > 2024-04-12 06:07:45 INFO i.a.c.i.b.IntegrationRunner(runInternal):231 - Completed integration: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-04-12 06:07:45 platform > 2024-04-12 06:07:45 INFO i.a.c.i.b.a.AdaptiveSourceRunner$Runner(run):88 - Completed source: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-04-12 06:07:47 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@4b9ac469[status=succeeded,message=<null>,additionalProperties={}]
2024-04-12 06:07:47 platform > 
2024-04-12 06:07:47 platform > ----- END CHECK -----
2024-04-12 06:07:47 platform > 
2024-04-12 06:07:47 platform > Docker volume job log path: /tmp/workspace/9/0/logs.log
2024-04-12 06:07:47 platform > Executing worker wrapper. Airbyte version: 0.57.2
2024-04-12 06:07:47 platform > Attempt 0 to save workflow id for cancellation
2024-04-12 06:07:47 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-04-12 06:07:47 platform > 
2024-04-12 06:07:47 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-04-12 06:07:47 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-04-12 06:07:47 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-04-12 06:07:47 platform > ----- START CHECK -----
2024-04-12 06:07:47 platform > 
2024-04-12 06:07:47 platform > Checking if starrocks/destination-starrocks:latest exists...
2024-04-12 06:07:47 platform > starrocks/destination-starrocks:latest was found locally.
2024-04-12 06:07:47 platform > Creating docker container = destination-starrocks-check-9-0-yrgrv with resources io.airbyte.config.ResourceRequirements@6760d031[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2024-04-12 06:07:47 platform > Preparing command: docker run --rm --init -i -w /data/9/0 --log-driver none --name destination-starrocks-check-9-0-yrgrv --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=starrocks/destination-starrocks:latest -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.57.2 -e WORKER_JOB_ID=9 starrocks/destination-starrocks:latest check --config source_config.json
2024-04-12 06:07:47 platform > Reading messages from protocol version 0.2.0
2024-04-12 06:07:48 platform > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
2024-04-12 06:07:48 platform > INFO i.a.i.b.IntegrationRunner(runInternal):130 Running integration: io.airbyte.integrations.destination.starrocks.StarRocksDestination
2024-04-12 06:07:48 platform > INFO i.a.i.b.IntegrationRunner(runInternal):131 Command: CHECK
2024-04-12 06:07:48 platform > INFO i.a.i.b.IntegrationRunner(runInternal):132 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2024-04-12 06:07:48 platform > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-04-12 06:07:48 platform > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-04-12 06:07:48 platform > INFO i.a.i.d.s.SqlUtil(createJDBCConnection):37 dbURL: jdbc:mysql://10.134.145.8:9030/airbyte?rewriteBatchedStatements=true&useUnicode=true&characterEncoding=utf8
2024-04-12 06:07:48 platform > INFO i.a.i.d.s.SqlUtil(createJDBCConnection):42 dbURL-root: jdbc:mysql://10.134.145.8:9030/?rewriteBatchedStatements=true&useUnicode=true&characterEncoding=utf8
2024-04-12 06:07:49 platform > INFO i.a.i.d.s.StarRocksDestination(check):75 HTTP/HTTPS test connectivity : http://10.134.145.8:8030
2024-04-12 06:07:49 platform > INFO i.a.i.d.s.StarRocksDestination(check):82 HTTP Response Code : 401
2024-04-12 06:07:49 platform > INFO i.a.i.b.IntegrationRunner(runInternal):226 Completed integration: io.airbyte.integrations.destination.starrocks.StarRocksDestination
2024-04-12 06:07:49 platform > Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@6f169c20[status=succeeded,message=<null>,additionalProperties={}]
2024-04-12 06:07:49 platform > 
2024-04-12 06:07:49 platform > ----- END CHECK -----
2024-04-12 06:07:49 platform > 
2024-04-12 06:07:50 platform > Docker volume job log path: /tmp/workspace/9/0/logs.log
2024-04-12 06:07:50 platform > Executing worker wrapper. Airbyte version: 0.57.2
2024-04-12 06:07:50 platform > Attempt 0 to save workflow id for cancellation
2024-04-12 06:07:50 platform > start sync worker. job id: 9 attempt id: 0
2024-04-12 06:07:50 platform > 
2024-04-12 06:07:50 platform > ----- START REPLICATION -----
2024-04-12 06:07:50 platform > 
2024-04-12 06:07:50 platform > Running destination...
2024-04-12 06:07:50 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-04-12 06:07:50 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-04-12 06:07:50 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-04-12 06:07:50 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-04-12 06:07:50 platform > Using default value for environment variable SIDECAR_KUBE_CPU_LIMIT: '2.0'
2024-04-12 06:07:50 platform > Using default value for environment variable SOCAT_KUBE_CPU_LIMIT: '2.0'
2024-04-12 06:07:50 platform > Using default value for environment variable SIDECAR_KUBE_CPU_REQUEST: '0.1'
2024-04-12 06:07:50 platform > Using default value for environment variable SOCAT_KUBE_CPU_REQUEST: '0.1'
2024-04-12 06:07:50 platform > Checking if starrocks/destination-starrocks:latest exists...
2024-04-12 06:07:50 platform > Checking if airbyte/source-snowflake:0.3.1 exists...
2024-04-12 06:07:50 platform > airbyte/source-snowflake:0.3.1 was found locally.
2024-04-12 06:07:50 platform > starrocks/destination-starrocks:latest was found locally.
2024-04-12 06:07:50 platform > Creating docker container = source-snowflake-read-9-0-ujjdp with resources io.airbyte.config.ResourceRequirements@4294496c[cpuRequest=1,cpuLimit=2,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts io.airbyte.config.AllowedHosts@509799d9[hosts=[am34576.ap-southeast-2.snowflakecomputing.com, *.datadoghq.com, *.datadoghq.eu, *.sentry.io],additionalProperties={}]
2024-04-12 06:07:50 platform > Creating docker container = destination-starrocks-write-9-0-enodm with resources io.airbyte.config.ResourceRequirements@7fdde436[cpuRequest=1,cpuLimit=2,memoryRequest=1Gi,memoryLimit=2Gi,additionalProperties={}] and allowedHosts null
2024-04-12 06:07:50 platform > Preparing command: docker run --rm --init -i -w /data/9/0 --log-driver none --name source-snowflake-read-9-0-ujjdp -e CONCURRENT_SOURCE_STREAM_READ=false --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-snowflake:0.3.1 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.57.2 -e WORKER_JOB_ID=9 --cpus=2 --memory-reservation=1Gi --memory=2Gi airbyte/source-snowflake:0.3.1 read --config source_config.json --catalog source_catalog.json
2024-04-12 06:07:50 platform > Preparing command: docker run --rm --init -i -w /data/9/0 --log-driver none --name destination-starrocks-write-9-0-enodm --network host -v airbyte_workspace:/data -v oss_local_root:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=starrocks/destination-starrocks:latest -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE=dev -e WORKER_ENVIRONMENT=DOCKER -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT=config -e AIRBYTE_VERSION=0.57.2 -e WORKER_JOB_ID=9 --cpus=2 --memory-reservation=1Gi --memory=2Gi starrocks/destination-starrocks:latest write --config destination_config.json --catalog destination_catalog.json
2024-04-12 06:07:50 platform > Reading messages from protocol version 0.2.0
2024-04-12 06:07:50 platform > Writing messages to protocol version 0.2.0
2024-04-12 06:07:50 platform > Reading messages from protocol version 0.2.0
2024-04-12 06:07:50 platform > readFromSource: start
2024-04-12 06:07:50 platform > Starting source heartbeat check. Will check threshold of 10800 seconds, every 1 minutes.
2024-04-12 06:07:50 platform > processMessage: start
2024-04-12 06:07:50 platform > writeToDestination: start
2024-04-12 06:07:50 platform > readFromDestination: start
2024-04-12 06:07:50 destination > WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
2024-04-12 06:07:50 source > WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
2024-04-12 06:07:53 source > 2024-04-12 06:07:53 INFO i.a.c.i.b.a.AdaptiveSourceRunner$Runner(getSource):74 - Running source under deployment mode: OSS
2024-04-12 06:07:53 source > 2024-04-12 06:07:53 INFO i.a.c.i.b.a.AdaptiveSourceRunner$Runner(run):86 - Starting source: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-04-12 06:07:54 destination > INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2024-04-12 06:07:54 destination > INFO i.a.i.b.IntegrationRunner(runInternal):130 Running integration: io.airbyte.integrations.destination.starrocks.StarRocksDestination
2024-04-12 06:07:54 destination > INFO i.a.i.b.IntegrationRunner(runInternal):131 Command: WRITE
2024-04-12 06:07:54 destination > INFO i.a.i.b.IntegrationRunner(runInternal):132 Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2024-04-12 06:07:54 source > 2024-04-12 06:07:54 INFO i.a.c.i.b.IntegrationCliParser(parseOptions):126 - integration args: {read=null, catalog=source_catalog.json, config=source_config.json}
2024-04-12 06:07:54 source > 2024-04-12 06:07:54 INFO i.a.c.i.b.IntegrationRunner(runInternal):132 - Running integration: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-04-12 06:07:54 source > 2024-04-12 06:07:54 INFO i.a.c.i.b.IntegrationRunner(runInternal):133 - Command: READ
2024-04-12 06:07:54 source > 2024-04-12 06:07:54 INFO i.a.c.i.b.IntegrationRunner(runInternal):134 - Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='null'}
2024-04-12 06:07:55 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-04-12 06:07:55 destination > WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-04-12 06:07:55 source > 2024-04-12 06:07:55 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-04-12 06:07:55 source > 2024-04-12 06:07:55 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2024-04-12 06:07:55 destination > INFO i.a.i.d.s.StarRocksDestination(getConsumer):101 JsonNode config: 
{
  "ssl" : false,
  "host" : "10.134.145.8",
  "database" : "airbyte",
  "password":"**********",
  "username" : "root",
  "http_port" : 8030,
  "query_port" : 9030
}
2024-04-12 06:07:55 destination > INFO i.a.i.d.s.SqlUtil(createJDBCConnection):37 dbURL: jdbc:mysql://10.134.145.8:9030/airbyte?rewriteBatchedStatements=true&useUnicode=true&characterEncoding=utf8
2024-04-12 06:07:55 destination > INFO i.a.i.d.s.SqlUtil(createJDBCConnection):42 dbURL-root: jdbc:mysql://10.134.145.8:9030/?rewriteBatchedStatements=true&useUnicode=true&characterEncoding=utf8
2024-04-12 06:07:55 source > 2024-04-12 06:07:55 INFO i.a.c.i.s.r.s.StateManagerFactory(createStateManager):57 - Stream state manager selected to manage state object with type STREAM.
2024-04-12 06:07:55 source > 2024-04-12 06:07:55 INFO i.a.c.i.s.r.s.CursorManager(createCursorInfoForStream):192 - No cursor field set in catalog but not present in state. Stream: AIRCALL_TEAM, New Cursor Field: null. Resetting cursor value
2024-04-12 06:07:55 destination > INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):173 class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2024-04-12 06:07:55 destination > INFO i.a.i.d.s.StarRocksBufferedConsumerFactory(lambda$onStartFunction$0):69 Preparing tmp tables in destination started for 1 streams
2024-04-12 06:07:55 destination > INFO i.a.i.d.s.SqlUtil(createDatabaseIfNotExist):74 SQL: CREATE DATABASE IF NOT EXISTS airbyte;
2024-04-12 06:07:55 destination > INFO i.a.i.d.s.StarRocksBufferedConsumerFactory(lambda$onStartFunction$0):75 Preparing tmp table in destination started for stream TEAM. tmp table name: _airbyte_tmp_izn_TEAM
2024-04-12 06:07:55 destination > INFO i.a.i.d.s.SqlUtil(createTableIfNotExist):107 SQL: CREATE TABLE IF NOT EXISTS _airbyte_tmp_izn_TEAM ( 
`_airbyte_emitted_at` BIGINT,
`_airbyte_ab_id` varchar(40),
`_airbyte_data` String)
DUPLICATE KEY(`_airbyte_emitted_at`,`_airbyte_ab_id`) 
DISTRIBUTED BY HASH(`_airbyte_ab_id`) BUCKETS 16 
PROPERTIES ( 
"replication_num" = "1" 
);
2024-04-12 06:07:55 destination > INFO i.a.i.d.s.SqlUtil(truncateTable):80 SQL: TRUNCATE TABLE _airbyte_tmp_izn_TEAM;
2024-04-12 06:07:56 destination > INFO i.a.i.d.s.StarRocksBufferedConsumerFactory(lambda$onStartFunction$0):82 Preparing tmp tables in destination completed.
2024-04-12 06:07:56 source > 2024-04-12 06:07:56 INFO i.a.i.s.s.SnowflakeSource(buildUsernamePasswordConfig):121 - jdbc:snowflake://<>/?role=<>&warehouse=<>&database=<>&JDBC_QUERY_RESULT_FORMAT=JSON&CLIENT_SESSION_KEEP_ALIVE=true&application=airbyte_oss&schema=AIRCALL&CLIENT_METADATA_REQUEST_USE_CONNECTION_CTX=true
2024-04-12 06:07:56 source > 2024-04-12 06:07:56 INFO i.a.i.s.s.SnowflakeDataSourceUtils(createDataSource):77 - Authorization mode is 'Username and password'
2024-04-12 06:07:56 source > 2024-04-12 06:07:56 INFO c.z.h.HikariDataSource(getConnection):109 - HikariPool-1 - Starting...
2024-04-12 06:07:59 source > 2024-04-12 06:07:59 INFO c.z.h.p.HikariPool(checkFailFast):554 - HikariPool-1 - Added connection net.snowflake.client.jdbc.SnowflakeConnectionV1@26c89563
2024-04-12 06:07:59 source > 2024-04-12 06:07:59 INFO c.z.h.HikariDataSource(getConnection):122 - HikariPool-1 - Start completed.
2024-04-12 06:07:59 source > 2024-04-12 06:07:59 INFO i.a.c.i.s.j.AbstractJdbcSource(logPreSyncDebugData):467 - Data source product recognized as Snowflake:8.13.3
2024-04-12 06:07:59 source > 2024-04-12 06:07:59 INFO i.a.c.i.s.j.AbstractJdbcSource(discoverInternal):169 - Internal schemas to exclude: [INFORMATION_SCHEMA]
2024-04-12 06:08:00 source > 2024-04-12 06:08:00 INFO i.a.c.i.s.j.AbstractJdbcSource(queryTableFullRefresh):113 - Queueing query for table: TEAM
2024-04-12 06:08:00 source > 2024-04-12 06:08:00 INFO i.a.c.i.s.r.RelationalDbQueryUtils(lambda$queryTable$0):77 - Queueing query: SELECT "ID","NAME","_FIVETRAN_DELETED","CREATED_AT","_FIVETRAN_SYNCED" FROM "AIRCALL"."TEAM"
2024-04-12 06:08:01 source > 2024-04-12 06:08:01 INFO i.a.c.d.j.s.AdaptiveStreamingQueryConfig(initialize):31 - Set initial fetch size: 10 rows
2024-04-12 06:08:01 source > 2024-04-12 06:08:01 INFO i.a.c.d.j.s.AdaptiveStreamingQueryConfig(accept):40 - Set new fetch size: 536082 rows
2024-04-12 06:08:01 source > 2024-04-12 06:08:01 INFO i.a.c.d.j.s.TwoStageSizeEstimator(getTargetBufferByteSize):71 - Max memory limit: 1610612736, JDBC buffer size: 966367642
2024-04-12 06:08:02 source > 2024-04-12 06:08:02 INFO i.a.c.i.s.r.AbstractDbSource(lambda$read$1):184 - Closing database connection pool.
2024-04-12 06:08:02 source > 2024-04-12 06:08:02 INFO c.z.h.HikariDataSource(close):349 - HikariPool-1 - Shutdown initiated...
2024-04-12 06:08:03 source > 2024-04-12 06:08:03 INFO c.z.h.HikariDataSource(close):351 - HikariPool-1 - Shutdown completed.
2024-04-12 06:08:03 source > 2024-04-12 06:08:03 INFO i.a.c.i.s.r.AbstractDbSource(lambda$read$1):186 - Closed database connection pool.
2024-04-12 06:08:03 source > 2024-04-12 06:08:03 INFO i.a.c.i.b.IntegrationRunner(runInternal):231 - Completed integration: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-04-12 06:08:03 source > 2024-04-12 06:08:03 INFO i.a.c.i.b.a.AdaptiveSourceRunner$Runner(run):88 - Completed source: io.airbyte.integrations.source.snowflake.SnowflakeSource
2024-04-12 06:08:04 platform > Total records read: 32 (4 KB)
2024-04-12 06:08:04 platform > Schema validation was performed to a max of 10 records with errors per stream.
2024-04-12 06:08:04 platform > readFromSource: done. (source.isFinished:true, fromSource.isClosed:false)
2024-04-12 06:08:04 platform > processMessage: done. (fromSource.isDone:true, forDest.isClosed:false)
2024-04-12 06:08:04 platform > thread status... heartbeat thread: false , replication thread: true
2024-04-12 06:08:04 platform > writeToDestination: done. (forDest.isDone:true, isDestRunning:true)
2024-04-12 06:08:04 platform > thread status... timeout thread: false , replication thread: true
2024-04-12 06:08:04 destination > INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 Airbyte message consumer: succeeded.
2024-04-12 06:08:04 destination > INFO i.a.i.d.b.BufferedStreamConsumer(close):288 executing on success close procedure.
2024-04-12 06:08:04 destination > INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(flushAllBuffers):85 Flushing TEAM: 29 records (17 KB)
2024-04-12 06:08:04 destination > INFO i.a.i.d.s.DefaultStreamLoader(send):107 Stream Load URL : http://10.134.145.8:8030/api/airbyte/_airbyte_tmp_izn_TEAM/_stream_load
2024-04-12 06:08:04 destination > INFO i.a.i.d.s.DefaultStreamLoader(send):116 Stream loading, label : airbyte__airbyte_tmp_izn_TEAM_501d0197-45f9-44bb-ae68-5a8706705e951712902084430, database : airbyte, table : _airbyte_tmp_izn_TEAM, request : PUT http://10.134.145.8:8030/api/airbyte/_airbyte_tmp_izn_TEAM/_stream_load HTTP/1.1
2024-04-12 06:08:04 destination > ERROR i.a.i.d.s.DefaultStreamLoader(send):163 error response from stream load: 

2024-04-12 06:08:04 destination > ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details. io.airbyte.integrations.destination.starrocks.exception.StreamLoadFailException: Stream load failed because of unknown exception, db: airbyte, table: _airbyte_tmp_izn_TEAM, label: airbyte__airbyte_tmp_izn_TEAM_501d0197-45f9-44bb-ae68-5a8706705e951712902084430
	at io.airbyte.integrations.destination.starrocks.DefaultStreamLoader.send(DefaultStreamLoader.java:166) ~[io.airbyte.airbyte-integrations.connectors-destination-starrocks-0.50.13.jar:?]
	at io.airbyte.integrations.destination.starrocks.StarRocksBufferedConsumerFactory.lambda$recordWriterFunction$2(StarRocksBufferedConsumerFactory.java:100) ~[io.airbyte.airbyte-integrations.connectors-destination-starrocks-0.50.13.jar:?]
	at io.airbyte.integrations.destination.record_buffer.InMemoryRecordBufferingStrategy.flushAllBuffers(InMemoryRecordBufferingStrategy.java:87) ~[io.airbyte.airbyte-integrations.bases-base-java-0.50.13.jar:?]
	at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:291) ~[io.airbyte.airbyte-integrations.bases-base-java-0.50.13.jar:?]
	at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82) ~[io.airbyte.airbyte-integrations.bases-base-java-0.50.13.jar:?]
	at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95) ~[io.airbyte.airbyte-integrations.bases-base-java-0.50.13.jar:?]
	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:186) ~[io.airbyte.airbyte-integrations.bases-base-java-0.50.13.jar:?]
	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:123) ~[io.airbyte.airbyte-integrations.bases-base-java-0.50.13.jar:?]
	at io.airbyte.integrations.destination.starrocks.StarRocksDestination.main(StarRocksDestination.java:51) ~[io.airbyte.airbyte-integrations.connectors-destination-starrocks-0.50.13.jar:?]
Caused by: java.lang.NullPointerException: Cannot invoke "io.airbyte.integrations.destination.starrocks.StreamLoadResponse$StreamLoadResponseBody.getStatus()" because "streamLoadBody" is null
	at io.airbyte.integrations.destination.starrocks.DefaultStreamLoader.send(DefaultStreamLoader.java:132) ~[io.airbyte.airbyte-integrations.connectors-destination-starrocks-0.50.13.jar:?]
	... 8 more

Stack Trace: io.airbyte.integrations.destination.starrocks.exception.StreamLoadFailException: Stream load failed because of unknown exception, db: airbyte, table: _airbyte_tmp_izn_TEAM, label: airbyte__airbyte_tmp_izn_TEAM_501d0197-45f9-44bb-ae68-5a8706705e951712902084430
	at io.airbyte.integrations.destination.starrocks.DefaultStreamLoader.send(DefaultStreamLoader.java:166)
	at io.airbyte.integrations.destination.starrocks.StarRocksBufferedConsumerFactory.lambda$recordWriterFunction$2(StarRocksBufferedConsumerFactory.java:100)
	at io.airbyte.integrations.destination.record_buffer.InMemoryRecordBufferingStrategy.flushAllBuffers(InMemoryRecordBufferingStrategy.java:87)
	at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.close(BufferedStreamConsumer.java:291)
	at io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:82)
	at io.airbyte.integrations.base.Destination$ShimToSerializedAirbyteMessageConsumer.close(Destination.java:95)
	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:186)
	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:123)
	at io.airbyte.integrations.destination.starrocks.StarRocksDestination.main(StarRocksDestination.java:51)
Caused by: java.lang.NullPointerException: Cannot invoke "io.airbyte.integrations.destination.starrocks.StreamLoadResponse$StreamLoadResponseBody.getStatus()" because "streamLoadBody" is null
	at io.airbyte.integrations.destination.starrocks.DefaultStreamLoader.send(DefaultStreamLoader.java:132)
	... 8 more

2024-04-12 06:08:04 platform > readFromDestination: exception caught
io.airbyte.workers.internal.exception.DestinationException: Destination process exited with non-zero exit code 1
	at io.airbyte.workers.general.BufferedReplicationWorker.readFromDestination(BufferedReplicationWorker.java:472) ~[io.airbyte-airbyte-commons-worker-0.57.2.jar:?]
	at io.airbyte.workers.general.BufferedReplicationWorker.lambda$runAsync$2(BufferedReplicationWorker.java:227) ~[io.airbyte-airbyte-commons-worker-0.57.2.jar:?]
	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
	at java.base/java.lang.Thread.run(Thread.java:1583) [?:?]
2024-04-12 06:08:04 platform > readFromDestination: done. (writeToDestFailed:false, dest.isFinished:true)
2024-04-12 06:08:05 platform > sync summary: {
  "status" : "failed",
  "recordsSynced" : 0,
  "bytesSynced" : 0,
  "startTime" : 1712902070093,
  "endTime" : 1712902085004,
  "totalStats" : {
    "bytesCommitted" : 0,
    "bytesEmitted" : 4541,
    "destinationStateMessagesEmitted" : 0,
    "destinationWriteEndTime" : 0,
    "destinationWriteStartTime" : 1712902070094,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "recordsEmitted" : 29,
    "recordsCommitted" : 0,
    "replicationEndTime" : 1712902085004,
    "replicationStartTime" : 1712902070093,
    "sourceReadEndTime" : 1712902084348,
    "sourceReadStartTime" : 1712902070096,
    "sourceStateMessagesEmitted" : 0
  },
  "streamStats" : [ {
    "streamName" : "TEAM",
    "streamNamespace" : "AIRCALL",
    "stats" : {
      "bytesCommitted" : 0,
      "bytesEmitted" : 4541,
      "recordsEmitted" : 29,
      "recordsCommitted" : 0
    }
  } ],
  "performanceMetrics" : {
    "processFromSource" : {
      "elapsedTimeInNanos" : 41167874,
      "executionCount" : 32,
      "avgExecTimeInNanos" : 1286496.0625
    },
    "readFromSource" : {
      "elapsedTimeInNanos" : 14188032526,
      "executionCount" : 2812,
      "avgExecTimeInNanos" : 5045530.77027027
    },
    "processFromDest" : {
      "elapsedTimeInNanos" : 26150,
      "executionCount" : 1,
      "avgExecTimeInNanos" : 26150.0
    },
    "writeToDest" : {
      "elapsedTimeInNanos" : 198806,
      "executionCount" : 29,
      "avgExecTimeInNanos" : 6855.379310344828
    },
    "readFromDest" : {
      "elapsedTimeInNanos" : 14814280926,
      "executionCount" : 279,
      "avgExecTimeInNanos" : 5.309778109677419E7
    }
  }
}
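
One observation: the CREATE TABLE and TRUNCATE statements above go through the MySQL protocol port (9030) and succeed; the failure only happens on the Stream Load HTTP call to port 8030. To confirm the tmp table itself accepts writes, the two paths can be isolated with a manual insert over 9030 (just a sketch; the row values are made up, and the table name is taken from the log above):

-- Hypothetical probe row; columns match the _airbyte_tmp_izn_TEAM DDL from the log.
INSERT INTO airbyte._airbyte_tmp_izn_TEAM
    (`_airbyte_emitted_at`, `_airbyte_ab_id`, `_airbyte_data`)
VALUES
    (1712902084000, 'manual-probe-row', '{"probe": true}');

If that succeeds, the problem is confined to the HTTP Stream Load path rather than the table or database setup.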

Hi,

Any news or a solution on this?

I am hitting the same error:

Cannot invoke "io.airbyte.integrations.destination.starrocks.StreamLoadResponse$StreamLoadResponseBody.getStatus()" because "streamLoadBody" is null

Could you take a look at the BE logs for that load?

SELECT * FROM information_schema.be_logs WHERE log RLIKE 'airbyte__airbyte_tmp_izn_TEAM_501d0197-45f9-44bb-ae68-5a8706705e951712902084430';

Here airbyte__airbyte_tmp_izn_TEAM_501d0197-45f9-44bb-ae68-5a8706705e951712902084430 is the stream load label from the failed run above.
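
If that query comes back empty, two more things might be worth checking. First, the destination check above logged HTTP Response Code : 401 from http://10.134.145.8:8030, and the connector still reported the check as succeeded. Second, the NullPointerException means the Stream Load response had no parseable body at all, which can happen when the FE's redirect to a BE never completes. A rough diagnostic sketch (plain StarRocks SQL; the output columns vary by version):

-- Stream Load is submitted to the FE HTTP port (8030) and redirected to a BE,
-- so verify every backend is Alive and reachable from the Airbyte host.
SHOW PROC '/backends';

-- And the FE side of the same picture.
SHOW FRONTENDS;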