Duplicate data created by prisma seed #708
Comments
Further to this: if run repeatedly, the Prisma seed will eventually fail, because the relationship and event data models combine ids to create a foreign key. The ids are duplicated, so the seed fails when it hits a certain combination.
Should we add a command to delete the existing data at the start of the seed file, to ensure these duplicates don't happen? Something like the sketch below.
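A minimal sketch of what that could look like at the top of the seed script, assuming the schema has `user`, `community`, `membership` and `event` models (the model names here are illustrative, not taken from the actual schema):

```ts
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

async function main() {
  // Delete in reverse dependency order so foreign-key constraints are not violated.
  await prisma.event.deleteMany();
  await prisma.membership.deleteMany();
  await prisma.community.deleteMany();
  await prisma.user.deleteMany();

  // ...existing seed logic continues here...
}

main()
  .catch((e) => {
    console.error(e);
    process.exit(1);
  })
  .finally(async () => {
    await prisma.$disconnect();
  });
```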
Duplicate of #677, just as an FYI! But thanks for tackling this.
My thought process was that we could run the seed without wiping the data created by the previous seed. That is probably overthinking it, so I will implement your suggestion.
I think we should, because if we change the schema you could end up with things out of sync. But I also wonder: can we somehow warn the user that they are about to blow away their data, so it's not a fun surprise 😂 Something like the sketch below, perhaps. Let's merge what you currently have for now anyway.
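One possible way to do that warning, sketched with Node's built-in readline (just an idea, nothing like this exists in the current codebase):

```ts
import * as readline from "node:readline/promises";
import { stdin, stdout } from "node:process";

// Ask for confirmation before the seed wipes existing data.
async function confirmWipe(): Promise<boolean> {
  const rl = readline.createInterface({ input: stdin, output: stdout });
  const answer = await rl.question(
    "Seeding will DELETE all existing data. Continue? (y/N) "
  );
  rl.close();
  return answer.trim().toLowerCase() === "y";
}
```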
Yes, that makes sense. I will look into it later.
There is a difference between how Niall's and Remo's code creates the random data in the seed file. When generating the user data, Prisma takes care of auto-assigning an id. When generating the communities data, an id is assigned in the function.
The chance package is invoked with the argument 1, which makes the random generation repeatable. Therefore, when creating a new community array, the ids are unique, but the actual data is repeated. This results in duplicated data (with unique ids).
Basically, each time we run `prisma db seed`, the community, membership and event data grows, which in turn exponentially increases the RSVP data.
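For reference, a small sketch of why a seeded Chance instance repeats data across runs (the field names are illustrative, not the actual seed code):

```ts
import Chance from "chance";

// A fixed seed makes the generator deterministic: every run of the seed
// script walks through the exact same sequence of values.
const seeded = new Chance(1);

console.log(seeded.company()); // same output on every run
console.log(seeded.email());   // same output on every run

// If the id comes from somewhere else (e.g. Prisma's auto-generated id),
// each run inserts rows with fresh ids but identical data, which is
// exactly how the duplicates accumulate.
```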