Duplicated field name: lake_load_tm: file=hdfs://

Hello

An error has occurred.

I need to run the INSERT INTO FILES command twice against the same folder "s3a://Folder/ABC", and the second run fails with:

=====> Error: Duplicated field name: lake_load_tm: file=hdfs://

Step-1)

INSERT INTO FILES(
    "path" = "s3a://Folder/ABC",    <====☆☆☆ s3a://Folder/ABC ☆☆☆
    "format" = "parquet",
    "aws.s3.access_key" = "user",
    "aws.s3.secret_key" = "123",
    "aws.s3.region" = "us-west-2",
    "aws.s3.endpoint" = "http://10.50.10.160:30104"
)
SELECT COL_A FROM TABLE_A;

Step-2)

INSERT INTO FILES(
    "path" = "s3a://Folder/ABC",    <====☆☆☆ s3a://Folder/ABC ☆☆☆
    "format" = "parquet",
    "aws.s3.access_key" = "user",
    "aws.s3.secret_key" = "123",
    "aws.s3.region" = "us-west-2",
    "aws.s3.endpoint" = "http://10.50.10.160:30104"
)
SELECT COL_B FROM TABLE_B;