Bloomfilter for Test-PasswordQuality #146
Hi @PatchRequest, that sounds like a good idea!
If time were an issue, I could see this being helpful as a sort of pre-filter. A bloom filter would be much faster at returning "not in set", and if it returned "possibly in set", a follow-up lookup in the larger database would drop the false-positive rate to zero. That's probably how I'd approach it in my use case. I'd be interested in testing how much faster I could run this scenario against my dataset.
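A rough sketch of that pre-filter pattern, assuming an illustrative `BloomFilter` class and an `is_breached` helper (neither is part of DSInternals; the exact lookup here is a binary search over a sorted in-memory list standing in for the larger database):

```python
import hashlib
from bisect import bisect_left

class BloomFilter:
    """Minimal bloom filter: k bit positions per item, derived
    from SHA-256 of the item plus a counter (illustrative scheme)."""

    def __init__(self, size_bits, num_hashes):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray((size_bits + 7) // 8)

    def _positions(self, item):
        for i in range(self.k):
            d = hashlib.sha256(item + i.to_bytes(4, "big")).digest()
            yield int.from_bytes(d[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

def is_breached(pwd_hash, bloom, sorted_hashes):
    # Fast path: a bloom filter never yields false negatives,
    # so a "not in set" answer is final.
    if not bloom.might_contain(pwd_hash):
        return False
    # Slow path: confirm possible hits against the full sorted list,
    # dropping the overall false-positive rate to zero.
    i = bisect_left(sorted_hashes, pwd_hash)
    return i < len(sorted_hashes) and sorted_hashes[i] == pwd_hash
```

The key property is that only the (rare) "possibly in set" answers ever pay for the expensive lookup.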
I would use speed as a secondary argument; I think size is more interesting, because with a bloom filter the "bad password list" can fit on any USB stick at a false positive rate of 0.001%. When benchmarking bloom filters, the nice thing is that lookups scale with O(1), while binary search is O(log N). Therefore, the bigger the password list gets, the more efficient the bloom filter becomes, which is a win-win situation. I think a parameter called
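For a rough idea of the sizes involved, the standard bloom filter sizing formulas can be applied; the item count of 1e9 below is an assumed ballpark for a large leaked-hash list, not a figure from this thread:

```python
import math

def bloom_parameters(n_items, fp_rate):
    """Optimal bloom filter size (bits) and hash-function count for a
    target false-positive rate, using the standard formulas
    m = -n * ln(p) / (ln 2)^2 and k = (m / n) * ln 2."""
    m = math.ceil(-n_items * math.log(fp_rate) / math.log(2) ** 2)
    k = max(1, round(m / n_items * math.log(2)))
    return m, k

# Assumed ballpark: 1e9 hashes at the 0.001% (1e-5) rate mentioned above.
m_bits, k = bloom_parameters(1_000_000_000, 1e-5)
print(f"{m_bits / 8 / 2**30:.1f} GiB, {k} hash functions")
# → 2.8 GiB, 17 hash functions
```

Under those assumptions the filter lands in the low single-digit gigabytes, consistent with the roughly 3GB figure in the issue.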
Sounds great. I used to store sample databases with git lfs, which was not a good idea. I am considering doing a cleanup, uploading my test ntds.dit files (several GBs) to Azure Blob Storage, and integrating their download into the unit test runner.
Mmh, an alternative could be to get it hosted somewhere else where there is no quota :/
But anyway, I will start developing the features.
For: Test-PasswordQuality
Instead of passing a 30GB file with all hashes, a bloom filter could be created from it and used for the check.
That would reduce the file size to around 3GB and would be much faster and more efficient.
I could implement such a feature; would you be interested?
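As a rough sketch of what the build step could look like (the one-hash-per-line input format and all function names here are assumptions, not the proposed implementation):

```python
import hashlib

def _positions(item, size_bits, num_hashes):
    # Derive num_hashes bit positions from SHA-256 of item + counter.
    for i in range(num_hashes):
        d = hashlib.sha256(item + i.to_bytes(4, "big")).digest()
        yield int.from_bytes(d[:8], "big") % size_bits

def build_bloom(path, size_bits, num_hashes=17):
    # Stream the hash list line by line, so the 30GB input never has
    # to fit in memory; only the resulting bit array does.
    bits = bytearray((size_bits + 7) // 8)
    with open(path, "rb") as f:
        for line in f:
            for pos in _positions(line.strip(), size_bits, num_hashes):
                bits[pos // 8] |= 1 << (pos % 8)
    return bits

def bloom_contains(bits, item, size_bits, num_hashes=17):
    # True means "possibly in set"; False is a definitive miss.
    return all(bits[pos // 8] & (1 << (pos % 8))
               for pos in _positions(item, size_bits, num_hashes))
```

The resulting `bits` array is the only thing that would need to ship to users, which is where the size reduction comes from.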