
"spark.sql.sources.partitionOverwriteMode": "DYNAMIC" - creates additional tables #1314

Open
MichalBogoryja opened this issue Nov 15, 2024 · 1 comment

@MichalBogoryja

When writing a Spark DataFrame to an existing partitioned BigQuery table, the target table is modified as expected (partitions are added/overwritten). However, an additional table is also saved; it contains exactly the data of the DataFrame that I was adding to the target table.
To reproduce:
database state: empty

from pyspark.sql import SparkSession
spark = (SparkSession.builder
         .config("spark.sql.sources.partitionOverwriteMode", "DYNAMIC")
         .config("enableReadSessionCaching", "false")
         .getOrCreate())
# sdf is a DataFrame with a 'curdate' date column; gcp_project_id, db and
# table_name identify the target BigQuery table
sdf.write.format("bigquery").option("partitionField", "curdate").option("partitionType", "DAY").mode("overwrite").save(f"{gcp_project_id}.{db}.{table_name}")

database state:
one table named {table_name} - data as in sdf

sdf_2.write.format("bigquery").mode("overwrite").save(f"{gcp_project_id}.{db}.{table_name}")

database state:
one table named {table_name} - data from sdf plus the new data from sdf_2 (or, if sdf_2 contains the same partitions as sdf, those partitions are overwritten)
an ADDITIONAL table named {table_name}<random digits> (e.g. table_name4467706876500)

Can you modify the saving function so that it does not create this additional table (or drops it after the save completes)?
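A rough manual cleanup sketch in the meantime, assuming the leftover tables are always named as the target table followed by digits (as in the example above) and using the google-cloud-bigquery client library rather than the connector itself:

import re
from google.cloud import bigquery

client = bigquery.Client(project=gcp_project_id)
# leftover tables look like the target name followed by digits,
# e.g. table_name4467706876500
leftover = re.compile(rf"^{re.escape(table_name)}\d+$")
for table in client.list_tables(f"{gcp_project_id}.{db}"):
    if leftover.match(table.table_id):
        client.delete_table(table.reference)

This is only a workaround; if the connector is still mid-write, deleting a matching table could break the save, so it should run after the job finishes.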

@isha97
Member

isha97 commented Feb 3, 2025

Hi @MichalBogoryja, which connector version are you using? Please try the latest connector version, 0.41.1.
We do have a cleanup job that deletes all the temporary tables created once the application finishes.
Can you check your Spark driver logs for entries like Running cleanup jobs. Jobs count is?
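For reference, a minimal sketch of pinning the connector version when building the session (this assumes the Scala 2.12 build of the connector; spark.jars.packages is the standard Spark mechanism for pulling in a Maven artifact):

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         # pin the connector to 0.41.1 so the fix/cleanup job is in place
         .config("spark.jars.packages",
                 "com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:0.41.1")
         .config("spark.sql.sources.partitionOverwriteMode", "DYNAMIC")
         .getOrCreate())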
