Hi, I am looking for a PostgreSQL and MongoDB data dump between 1TB and 30TB. The idea would be to build a test pipeline for RepliByte with a very large dump file. It would help a lot with optimizing RepliByte. Anyone?
I don't know if this suggestion will help you, but an alternative solution would be to create your own dump, if no one can provide one.
A simple script could prepare the database with whatever amount of data you want (see the sketch below).
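For the PostgreSQL side, here is a minimal sketch of such a script. It grows a throwaway table with server-side `generate_series` inserts until the database reaches a target size. The connection DSN, table name, batch size, and target size are all hypothetical placeholders, not anything defined by RepliByte.

```python
# Sketch: fill a PostgreSQL database with synthetic rows until it reaches a
# target size. DSN, table name, and sizes below are made-up examples.
import psycopg2

TARGET_BYTES = 1 * 1024**4   # ~1 TiB; raise toward 30 TiB as needed
BATCH_ROWS = 1_000_000       # rows inserted per round trip

conn = psycopg2.connect("postgresql://postgres:postgres@localhost:5432/replibyte_test")
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS bulk_data (
            id bigserial PRIMARY KEY,
            payload text,
            created_at timestamptz DEFAULT now()
        )
    """)
    while True:
        # Generate the payload on the server so the client is not the bottleneck.
        cur.execute(
            "INSERT INTO bulk_data (payload) "
            "SELECT repeat(md5(random()::text), 32) FROM generate_series(1, %s)",
            (BATCH_ROWS,),
        )
        cur.execute("SELECT pg_database_size(current_database())")
        size = cur.fetchone()[0]
        print(f"current size: {size / 1024**3:.1f} GiB")
        if size >= TARGET_BYTES:
            break

conn.close()
```

Once the database is at the size you want, a plain `pg_dump` produces the dump file to feed into the pipeline. The same idea should work for MongoDB: insert batches of random documents in a loop, then run `mongodump`.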