diff --git a/README.md b/README.md
index edd0011..aa49cdf 100644
--- a/README.md
+++ b/README.md
@@ -138,7 +138,7 @@ Therefore, even for the same exact timeframe, you cannot reliably compare a pool
 
 Regardless of the diff selected, hashrate measurements based on shares * difficulty are subject to swings based on luck. The lower the share count (the higher the diff), the more luck affects the hashrate, and the wilder the swings. Thus, to have a statistically meaningful hashrate measurement, you need enough shares to reduce the effect of luck as much as possible. Neither 5m nor 30m readings on the ASIC are suitable for this, especially when trying to verify the result of single-digit OC changes, and short-term pool readings are even worse.
 
-You need 1200 shares just to get to an expected variance of +/- 10% with 99% confidence. E.g. for an expected hashrate of 1TH/s, in 99/100 measurements you will have a reading between 0.9TH/s and 1.1TH/s. You need 4800 shares to reduce that variance to +/- 5%. Many pools are using difficulties that produce share rates in the ~5 shares/min range. Therefore, just to get a hashrate reading with an expected variance of <= +/- 10%, you would need a 1200 / 5 = 240 minute, or 4 hour reading. If you want a reading with an expected variance +/- 5%, you would need over 16 hours of data. You will never be able to confirm the results of an OC level below the expected variance of a given timeframe. For example, you cannot possibly determine whether a 5% OC is working properly in a 4 hr / 1200 share window having 10% expected variance. Even at 16hrs / 4800 shares, the expected variance can completely cancel out a 5% OC.
+You need 1200 shares just to get to an expected variance of +/- 10% with 99% confidence. E.g. for an expected hashrate of 1TH/s, in 99/100 measurements after 1200 shares, you will have a reading between 0.9TH/s and 1.1TH/s. You need 4800 shares to reduce that variance to +/- 5%. Many pools use difficulties that produce share rates in the ~5 shares/min range. Therefore, just to get a hashrate reading with an expected variance of <= +/- 10%, you would need a 1200 / 5 = 240 minute, or 4 hour, reading. If you want a reading with an expected variance of +/- 5%, you would need about 16 hours of data (4800 / 5 = 960 minutes). You will never be able to confirm the results of an OC level below the expected variance of a given timeframe. For example, you cannot possibly determine whether a 5% OC is working properly in a 4 hr / 1200 share window having 10% expected variance. Even at 16 hrs / 4800 shares, the expected variance can completely cancel out a 5% OC.
 
 And this leads to the crux of the issue: most pools do not provide anything higher than a 24hr measurement, which at ~5 shares/minute is roughly 7200 shares, still a ~4% expected variance. You need 10K shares just for 3.3% variance, and about 100K shares for a 1% variance. The only solution, then, is to find a pool that lets you set your own difficulty, so that you can generate a statistically relevant number of shares for their available timeframes. While Herominers seems to have this functionality, my recent tests indicate it is not working, at least for a KS3.
 
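The share-count arithmetic above follows the usual 1/sqrt(N) luck scaling: expected variance shrinks with the square root of the number of shares. Below is a minimal Python sketch that anchors that scaling to the reference point quoted above (1200 shares for +/- 10% at 99% confidence) and approximately reproduces the other figures; the function names and the anchoring constants are illustrative assumptions, not any pool's API, and the exact derivation behind the 1200-share figure may differ slightly (e.g. the 3.3% quoted for 10K shares comes out to ~3.5% under this scaling).

```python
import math

# Reference point taken from the text above: ~1200 shares gives an expected
# variance of +/- 10% at 99% confidence. Everything else is scaled from it
# using the 1/sqrt(N) luck relationship (an assumption for illustration).
REF_SHARES = 1200
REF_VARIANCE = 0.10  # +/- 10%

def expected_variance(shares: int) -> float:
    """Expected relative variance (+/-) of a hashrate reading after `shares` shares."""
    return REF_VARIANCE * math.sqrt(REF_SHARES / shares)

def shares_needed(target_variance: float) -> int:
    """Shares required to bring the expected variance down to `target_variance`."""
    return math.ceil(REF_SHARES * (REF_VARIANCE / target_variance) ** 2)

def hours_needed(target_variance: float, shares_per_min: float = 5.0) -> float:
    """Measurement window (hours) at a given share rate, e.g. ~5 shares/min."""
    return shares_needed(target_variance) / shares_per_min / 60.0

if __name__ == "__main__":
    print(f"4800 shares   -> +/- {expected_variance(4800):.1%}")     # ~5%
    print(f"7200 shares   -> +/- {expected_variance(7200):.1%}")     # ~4% (24h at 5/min)
    print(f"100000 shares -> +/- {expected_variance(100_000):.1%}")  # ~1%
    print(f"+/- 5% needs {shares_needed(0.05)} shares, "
          f"~{hours_needed(0.05):.0f} h at 5 shares/min")
```

Under this scaling, halving the expected variance always costs 4x the shares (and thus 4x the time at a fixed share rate), which is why pushing below a few percent requires either very long windows or a much higher share rate via a lower difficulty.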