I am able to load data using INSERT at a rate of 2,000 rows per second with Python's SQLAlchemy library, but with Java's JDBC API I get a rate of 8 rows per second. Is this expected, or are there known methods to do better?
Are you committing as single transactions or as a batch?
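For context, a batched, single-commit JDBC insert typically looks like the sketch below. This is a minimal example, not the OP's code; the JDBC URL, table, and column names are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsert {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://host:9030/mydb", "user", "password")) {
            conn.setAutoCommit(false); // one transaction for the whole load, not one per row

            String sql = "INSERT INTO t (id, val) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int i = 0; i < 100_000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();
                    if (i % 10_000 == 0) {
                        ps.executeBatch(); // flush periodically to bound client memory
                    }
                }
                ps.executeBatch(); // flush the remainder
            }
            conn.commit();
        }
    }
}
```

Committing every single row forces a round trip and a transaction per insert, which is the usual cause of single-digit rows-per-second throughput.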
It's not recommended to use JDBC to insert data at volume. Use Stream Load instead; it's much more efficient.
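For reference, here is a rough sketch of a Stream Load call from Java, assuming a StarRocks/Doris-style `_stream_load` HTTP endpoint and the Apache HttpClient 4.x dependency (`org.apache.httpcomponents:httpclient`). The host, port, database, table, credentials, and file name are all placeholders.

```java
import java.io.File;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.HttpPut;
import org.apache.http.entity.FileEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.DefaultRedirectStrategy;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class StreamLoadExample {
    public static void main(String[] args) throws Exception {
        String url = "http://fe_host:8030/api/mydb/mytable/_stream_load";
        String auth = Base64.getEncoder()
                .encodeToString("user:password".getBytes(StandardCharsets.UTF_8));

        // Stream Load uses HTTP 100-continue and may redirect the PUT
        // from the frontend node to a backend node.
        RequestConfig config = RequestConfig.custom()
                .setExpectContinueEnabled(true)
                .build();

        try (CloseableHttpClient client = HttpClients.custom()
                .setDefaultRequestConfig(config)
                .setRedirectStrategy(new DefaultRedirectStrategy() {
                    @Override
                    protected boolean isRedirectable(String method) {
                        return true; // allow the PUT to follow the FE -> BE redirect
                    }
                })
                .build()) {
            HttpPut put = new HttpPut(url);
            put.setHeader("Authorization", "Basic " + auth);
            put.setHeader("label", "load_" + System.currentTimeMillis()); // unique label per load
            put.setHeader("column_separator", ",");
            put.setEntity(new FileEntity(new File("data.csv")));

            System.out.println(EntityUtils.toString(client.execute(put).getEntity()));
        }
    }
}
```

Because Stream Load ships the whole file in one HTTP request and lets the backend parse it, it sidesteps per-statement overhead entirely.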
It is a batch load, averaging 100,000 rows per INSERT.
Hm… that doesn't make sense. DBeaver uses the JDBC driver, and I'm able to get 100k rows per second on inserts from a 1.1 GB Parquet file. Try it with DBeaver (DBeaver 23.3.5, DBeaver Community).
Hello, is this problem solved now?
Yes, thanks. I optimised the INSERT statement to bundle many records into a single statement and execute that, as sketched below. Previously the insert was triggered through batchUpdate, which was issuing one INSERT per record.
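To illustrate the fix the OP describes: instead of one single-row INSERT per record, build one statement with many VALUES tuples. A minimal sketch, with a hypothetical table and columns:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class MultiRowInsert {
    // Inserts a chunk of rows with one statement:
    // INSERT INTO t (id, val) VALUES (?, ?), (?, ?), ...
    static void insertChunk(Connection conn, List<int[]> rows) throws SQLException {
        StringBuilder sql = new StringBuilder("INSERT INTO t (id, val) VALUES ");
        for (int i = 0; i < rows.size(); i++) {
            sql.append(i == 0 ? "(?, ?)" : ", (?, ?)");
        }
        try (PreparedStatement ps = conn.prepareStatement(sql.toString())) {
            int p = 1;
            for (int[] row : rows) {
                ps.setInt(p++, row[0]);
                ps.setInt(p++, row[1]);
            }
            ps.executeUpdate(); // one round trip for the whole chunk
        }
    }
}
```

As a side note: if the database speaks the MySQL protocol and you use MySQL Connector/J, adding `rewriteBatchedStatements=true` to the JDBC URL gives a similar effect automatically, rewriting batched single-row INSERTs into multi-row ones.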