Duplicate field name: lake_load_tm: file = hdfs://

Hello

An error has occurred. I need to run the INSERT INTO FILES command twice, both times writing to the folder "s3a://Folder/ABC", and the second run fails with:

===> Error: Duplicate field name: lake_load_tm: file = hdfs://

Step-1

INSERT INTO FILES(
    "path" = "s3a://Folder/ABC",    <======== same path as Step-2
    "format" = "parquet",
    "aws.s3.access_key" = "user",
    "aws.s3.secret_key" = "123!",
    "aws.s3.region" = "us-west-2",
    "aws.s3.endpoint" = "http://10.50.10.160:30101")
SELECT ColData_A FROM Table_A;

Step-2

INSERT INTO FILES(
    "path" = "s3a://Folder/ABC",    <======== same path as Step-1
    "format" = "parquet",
    "aws.s3.access_key" = "user",
    "aws.s3.secret_key" = "123!",
    "aws.s3.region" = "us-west-2",
    "aws.s3.endpoint" = "http://10.50.10.160:30101")
SELECT ColData_B FROM Table_B;
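For what it's worth, one thing I considered trying is pointing each run at its own subdirectory so the two result sets don't land in the same folder (the subfolder names Step1/Step2 below are made up for illustration; I have not confirmed this avoids the error):

INSERT INTO FILES(
    "path" = "s3a://Folder/ABC/Step1/",    -- hypothetical: separate subfolder per run
    "format" = "parquet",
    "aws.s3.access_key" = "user",
    "aws.s3.secret_key" = "123!",
    "aws.s3.region" = "us-west-2",
    "aws.s3.endpoint" = "http://10.50.10.160:30101")
SELECT ColData_A FROM Table_A;

But I would like to know whether writing both steps into the same folder is supposed to work.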

Please help me!