Spark connector with Hive/AWS Glue

Hi. I tried to save data to S3 via Spark -> StarRocks connector -> Hive Metastore/AWS Glue -> S3.

Setup:
CREATE EXTERNAL CATALOG glue
PROPERTIES
(
"type" = "hive",
"hive.metastore.type" = "glue",
"aws.s3.use_instance_profile" = "false",
"aws.s3.access_key" = "<>",
"aws.s3.secret_key" = "<>",
"aws.s3.region" = "<>"
);

SET CATALOG glue;
CREATE DATABASE demo PROPERTIES ("location" = "s3a://demo/");
USE demo;
SHOW CREATE TABLE t_demo;
DROP TABLE t_demo FORCE;
CREATE TABLE t_demo1
(
x varchar(1),
y varchar(1),
id VARCHAR(36),
date_day DATE
) PARTITION BY (date_day);

In Spark

df.write
  .format("starrocks")
  .option("starrocks.fe.http.url", "localhost:8030")
  .option("starrocks.fe.jdbc.url", "jdbc:mysql://localhost:9030/glue.demo")
  .option("starrocks.table.identifier", "demo.t_demo")
  .option("starrocks.user", "demo")
  .option("starrocks.password", "password")
  .mode("append") // .mode("overwrite")
  .save()

I got error in Spark:

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at com.starrocks.connector.spark.sql.connect.StarRocksConnector.extractColumnValuesBySql(StarRocksConnector.java:127)
at com.starrocks.connector.spark.sql.connect.StarRocksConnector.getSchema(StarRocksConnector.java:46)
at com.starrocks.connector.spark.sql.StarRocksTableProvider.getStarRocksSchema(StarRocksTableProvider.java:82)
at com.starrocks.connector.spark.sql.StarRocksTableProvider.inferSchema(StarRocksTableProvider.java:64)
at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Utils$.getTableFromProvider(DataSourceV2Utils.scala:90)
at org.apache.spark.sql.DataFrameWriter.getTable$1(DataFrameWriter.scala:281)
at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:297)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:248)
at com.draivn.datahub.fabric.generic.FabricProcessor.save(FabricProcessor.scala:22)
at com.draivn.datahub.fabric.generic.FabricProcessor.save$(FabricProcessor.scala:12)
at com.draivn.datahub.fabric.synchronize.vehicle.VehicleProcessor$.save(VehicleProcessor.scala:14)
at com.draivn.datahub.fabric.synchronize.vehicle.VehicleProcessor$.save(VehicleProcessor.scala:14)
at com.draivn.datahub.fabric.generic.GenericService.process(GenericService.scala:21)
at com.draivn.datahub.fabric.generic.GenericService.process$(GenericService.scala:13)
at com.draivn.datahub.fabric.DraivnFabricService.process(DraivnFabricService.scala:22)
at com.draivn.datahub.fabric.DraivnFabricExecutor.$anonfun$execute$2(DraivnFabricExecutor.scala:30)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:42)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:74)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure

I connected to jdbc:mysql://localhost:9030/glue.demo using third-party tools without any problem.

Hi Andrey, writing data to external data sources with the Spark-StarRocks connector is not supported. Could you try INSERT INTO as a substitute?
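
For example, a rough sketch of the INSERT INTO route (run in a StarRocks SQL client, not Spark): first load the DataFrame into an internal table with the connector, then copy it into the Glue-backed table. The internal database `staging` and table `t_demo_stage` below are hypothetical names, assumed to have the same schema as `t_demo1`:

```sql
-- Step 1 (Spark): point the connector at an internal table instead,
-- e.g. starrocks.table.identifier = "staging.t_demo_stage"
-- Step 2 (StarRocks SQL): copy from the internal table into the
-- Glue external catalog table; cross-catalog INSERT INTO ... SELECT
-- uses fully qualified catalog.database.table names.
INSERT INTO glue.demo.t_demo1
SELECT x, y, id, date_day
FROM default_catalog.staging.t_demo_stage;
```

This keeps Spark writing only to the internal catalog, which the connector supports, and lets StarRocks handle the write to S3 through the Glue catalog.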