Create/Insert Avro data via Java

I’d like to be able to create tables using Java APIs and batch-insert data. We have a streaming application using Kafka/Avro/Schema Registry. Our topics use:
value.subject.name.strategy: io.confluent.kafka.serializers.subject.TopicRecordNameStrategy

This means one topic can carry messages with different schemas. It’s critical for us to process messages sequentially as they are partitioned on the topic, so the topic must support multiple schemas. This is allowed and native to Schema Registry; I mention it so I don’t get comments/suggestions on changing how we distribute messages in Kafka, as that’s not going to happen.
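
For context, a minimal sketch of the producer configuration behind this setup (broker and registry URLs are placeholders):

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class MultiSchemaProducerConfig {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", KafkaAvroSerializer.class.getName());
        // Subject name = <topic>-<record fully-qualified name>, so a single
        // topic can carry records with different Avro schemas
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy");

        KafkaProducer<String, Object> producer = new KafkaProducer<>(props);
        producer.close();
    }
}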

We do not want to set up a Kafka Connect cluster due to the added cost/resources when we’re already consuming the data in a streamlined way.

Google’s search AI produced this snippet of code that got me excited, but I cannot find any such APIs:

import org.apache.avro.Schema;
import org.apache.starrocks.client.StarRocksClient;

import java.io.File;

public class CreateTableFromAvroSchema {
    public static void main(String[] args) throws Exception {
        // Create a StarRocks client
        StarRocksClient client = new StarRocksClient("localhost", 8080);

        // Create an Avro schema
        Schema schema = new Schema.Parser().parse(new File("avro_schema.json"));

        // Create a StarRocks table
        client.createTable("example_db", "example_table", schema);

        // Load data into the table
        // ...

        // Close the client
        client.close();
    }
}

This would be great, as the mapping from Avro schema types to StarRocks column types would be taken care of automatically, just using the schema.
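
For what it’s worth, here’s a rough sketch of what I imagine doing manually if no such API exists: walk the schema’s fields, map each Avro type to a StarRocks column type, and run the generated CREATE TABLE over StarRocks’ MySQL-compatible protocol with a plain JDBC driver. The type mapping here is a simplified assumption (unions, logical types, and nested records would need more care), and the host, credentials, and key/distribution choices are placeholders:

import org.apache.avro.Schema;

import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.StringJoiner;

public class AvroSchemaToStarRocksDdl {

    // Simplified Avro -> StarRocks type mapping; real schemas (unions,
    // logical types, nested records) need more handling than this
    static String toStarRocksType(Schema fieldSchema) {
        switch (fieldSchema.getType()) {
            case INT:     return "INT";
            case LONG:    return "BIGINT";
            case FLOAT:   return "FLOAT";
            case DOUBLE:  return "DOUBLE";
            case BOOLEAN: return "BOOLEAN";
            case STRING:  return "STRING";
            default:      return "STRING"; // fall back to STRING for everything else
        }
    }

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(new File("avro_schema.json"));

        StringJoiner cols = new StringJoiner(", ");
        for (Schema.Field f : schema.getFields()) {
            cols.add("`" + f.name() + "` " + toStarRocksType(f.schema()));
        }

        // Keying/distributing on the first field is just an example choice
        String firstCol = schema.getFields().get(0).name();
        String ddl = "CREATE TABLE example_db.example_table (" + cols + ") "
                + "DUPLICATE KEY(`" + firstCol + "`) "
                + "DISTRIBUTED BY HASH(`" + firstCol + "`)";

        // StarRocks' FE speaks the MySQL protocol (port 9030 by default)
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://fe-host:9030/example_db", "user", "password");
             Statement stmt = conn.createStatement()) {
            stmt.execute(ddl);
        }
    }
}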

Another snippet suggested I could maybe batch-insert using binary/serialized versions of Avro records:

import com.cloudera.starrocks.client.RoutineLoad;
import com.cloudera.starrocks.client.RoutineLoadJob;
import com.cloudera.starrocks.client.RoutineLoadJobBuilder;
import org.apache.avro.Schema;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;

public class InsertAvroRecordsIntoStarRocks {

    public static void main(String[] args) throws IOException {
        // Target StarRocks table name
        String table = "user";

        // Parse the Avro schema
        Schema schema = new Schema.Parser().parse(new File("user.avsc"));

        // Writer for User, a class assumed to be generated from the schema
        SpecificDatumWriter<User> writer = new SpecificDatumWriter<>(schema);

        // Serialize an Avro record (user, assumed in scope) to a byte array
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        writer.write(user, encoder);
        encoder.flush();

        // Insert the byte array into StarRocks
        RoutineLoadJob job = new RoutineLoadJobBuilder()
                .table(table)
                .format("avro")
                .data(out.toByteArray())
                .build();

        RoutineLoad routineLoad = new RoutineLoad();
        routineLoad.submitJob(job);
    }
}

Are there any APIs available such that I could do these creates/inserts directly into StarRocks?

Thanks!

I don’t think so; Google AI just hallucinates APIs that don’t exist.

I think your easiest way is to write code that just does an insert over StarRocks’ MySQL-compatible protocol. If possible, do a batched SQL insert for better performance.
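
A minimal sketch of that with the standard MySQL JDBC driver (StarRocks’ FE listens on the MySQL protocol, port 9030 by default; host, credentials, and table layout are placeholders):

import org.apache.avro.generic.GenericRecord;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchInsertIntoStarRocks {

    // Insert a batch of consumed Avro records via plain JDBC.
    // Assumes a table example_db.user(id BIGINT, name STRING) matching the records.
    static void insertBatch(Connection conn, List<GenericRecord> records) throws Exception {
        String sql = "INSERT INTO example_db.user (id, name) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (GenericRecord r : records) {
                ps.setLong(1, (Long) r.get("id"));
                ps.setString(2, String.valueOf(r.get("name"))); // Avro strings arrive as Utf8
                ps.addBatch();
            }
            ps.executeBatch(); // one round trip for the whole batch
        }
    }

    public static void main(String[] args) throws Exception {
        // rewriteBatchedStatements lets the driver collapse the batch
        // into multi-row INSERTs for better throughput
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://fe-host:9030/example_db?rewriteBatchedStatements=true",
                "user", "password")) {
            // insertBatch(conn, consumedRecords);
        }
    }
}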