I tried to subset a 2 GB Postgres database and RepliByte took 38 minutes to complete. I suspect the function `subset.postgres.filter_insert_into_rows(..)` is the bottleneck, since it is called multiple times and scans the entire dump file each time (even though there is a small index).

One change that could drastically reduce the time would be to split the dump into one file per table. Each scan would then be limited to the relevant table's file; a sketch of the idea is below.
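For illustration only, here is a minimal sketch of that split in Rust (RepliByte's language). It assumes a plain-text `pg_dump` output and only handles `COPY ... FROM stdin` data blocks; the function name, output layout, and table-detection logic are all hypothetical, not RepliByte's actual internals:

```rust
// Minimal sketch (not RepliByte's actual code): split a plain-text pg_dump
// into one file per table, so a later row filter only has to scan the file
// of the table it is subsetting instead of the whole dump.
use std::fs::{self, File};
use std::io::{BufRead, BufReader, BufWriter, Write};
use std::path::Path;

fn split_dump_by_table(dump_path: &Path, out_dir: &Path) -> std::io::Result<()> {
    fs::create_dir_all(out_dir)?;
    let reader = BufReader::new(File::open(dump_path)?);
    let mut current: Option<BufWriter<File>> = None;

    for line in reader.lines() {
        let line = line?;
        if line.starts_with("COPY ") {
            // e.g. `COPY public.orders (id, total) FROM stdin;`
            let table = line
                .trim_start_matches("COPY ")
                .split_whitespace()
                .next()
                .unwrap_or("unknown")
                .replace('.', "_");
            let file = File::create(out_dir.join(format!("{table}.sql")))?;
            let mut writer = BufWriter::new(file);
            writeln!(writer, "{line}")?;
            current = Some(writer);
        } else if let Some(writer) = current.as_mut() {
            writeln!(writer, "{line}")?;
            if line == "\\." {
                current = None; // `\.` terminates the COPY data block
            }
        }
        // Schema statements outside COPY blocks are dropped in this sketch;
        // a real implementation would route them to a shared schema file.
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    split_dump_by_table(Path::new("dump.sql"), Path::new("tables"))
}
```

With per-table files, a filter like `filter_insert_into_rows(..)` could open only the file for the table being filtered, so each pass scans data proportional to that table's size rather than the full 2 GB dump.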