Battery SOC improvements #35
I did some research into SOC estimation techniques, and it looks like the most common approach is coulomb counting: integrate the battery current over time, continuously subtract the accumulated charge from the remaining charge (in amp-hours or amp-seconds), and divide by the total charge to get the current state of charge as a percentage. Its accuracy relies on precise battery current measurement and an accurate initial state of charge reading. This method would solve the voltage-sag-under-load issue (since voltage sags under high current) and would also be less affected by temperature (since temperature's effect already shows up in the measured current). It also tends to work better with Li-ion batteries than voltage alone because of their relatively flat discharge voltage curves.
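To make the idea concrete, here is a very rough sketch in Arduino-style C++. It is not the project's actual code: the capacity value and names are placeholders, and the pack current would come from the ESC telemetry.

```cpp
// Rough coulomb-counting sketch (illustrative only, not the project's code).
// BATTERY_CAPACITY_AH is a placeholder; the measured pack current is passed in.

const float BATTERY_CAPACITY_AH = 50.0f;   // placeholder total pack capacity
float remainingAh = BATTERY_CAPACITY_AH;   // in practice, seed from the initial voltage-based SOC
unsigned long lastUpdateMs = 0;

// Call periodically with the latest measured pack current; returns SOC in percent.
float updateSOC(float packCurrentAmps) {
  unsigned long nowMs = millis();
  if (lastUpdateMs == 0) lastUpdateMs = nowMs;          // skip the first, undefined interval
  float dtHours = (nowMs - lastUpdateMs) / 3600000.0f;  // elapsed time in hours
  lastUpdateMs = nowMs;

  remainingAh -= packCurrentAmps * dtHours;             // integrate charge drawn since last call
  remainingAh = constrain(remainingAh, 0.0f, BATTERY_CAPACITY_AH);

  return (remainingAh / BATTERY_CAPACITY_AH) * 100.0f;  // SOC as a percentage
}
```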
I'm still not entirely sure how best to distinguish between the two battery sizes. You mentioned that at the same current a 2.2 kWh battery will drop in voltage faster than the 3.7 kWh one, so my current idea is to track the voltage derivative and guess which size is in use based on the current, the voltage derivative, and the initial SOC estimate. I think this would require a fair bit of testing and would still be a rough heuristic, but I don't know how else to do it unless there are key differences between the two battery sizes that would immediately signal which one is in use. Alternatively, some systems do rely on the voltage method for SOC estimation and compensate with a correction term based on temperature and current. I don't have any way to get the testing data myself, though, and the temperature correction term is usually taken directly from the BMS as the battery's internal temperature rather than the ambient temperature. Right now I'm leaning towards coulomb counting since current already factors in temperature and isn't impacted by the flat voltage profile of Li-ion batteries.
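Here is a hypothetical sketch of the pack-size guess, just to show the shape of the heuristic. The threshold and minimum current are made up and would need real test data to tune.

```cpp
// Hypothetical pack-size heuristic (illustrative only).
// Premise: at a similar current, the 2.2 kWh pack's voltage drops faster than
// the 3.7 kWh pack's, so a steeper sag rate per amp suggests the smaller pack.
// The threshold and minimum current below are made-up placeholders.

const float SAG_PER_AMP_THRESHOLD = -0.0005f;  // volts per second per amp (placeholder)
const float MIN_CURRENT_AMPS = 20.0f;          // only trust the heuristic under meaningful load

bool looksLikeSmallPack(float voltsPerSecond, float packCurrentAmps) {
  if (packCurrentAmps < MIN_CURRENT_AMPS) return false;     // too little load to tell
  float sagRatePerAmp = voltsPerSecond / packCurrentAmps;   // normalize the sag rate by load
  return sagRatePerAmp < SAG_PER_AMP_THRESHOLD;             // steeper sag -> likely the 2.2 kWh pack
}
```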
Thanks for the help here @trebula13579
Yes, this does happen in our newer BMS, but in the current version the BMS data is only accessible via a special Bluetooth app on the user's smartphone. In the future we will be integrating with the BMS more directly, but for now we can assume we only have access to the ESC data.
It's 3.6 V per cell. Here is the datasheet: https://www.molicel.com/wp-content/uploads/INR21700P42A-V4-80092.pdf (see eppg-controller/src/sp140/power.ino, line 6 in 34aa9ae).
No, and that is part of the problem. Under load the voltage sags so we get inaccuracies when flying at different power draws.
Ideally the algorithm would be smart enough to figure out the battery size/health based on resting voltage vs. sag under load over a certain number of seconds or minutes. However, I anticipated this and have an option (currently not user-editable, but it can be) to help the code distinguish between battery sizes (see eppg-controller/inc/sp140/structs.h, line 29 in 34aa9ae).
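Purely as an illustration (the real field in structs.h may be named and typed differently), the stored setting could select which capacity the SOC math divides by:

```cpp
#include <stdint.h>

// Hypothetical illustration only; not the actual field from structs.h.
enum BatterySize : uint8_t { PACK_2_2_KWH = 0, PACK_3_7_KWH = 1 };

// Map the stored setting to a usable pack capacity for the SOC math.
float packCapacityAh(BatterySize size) {
  // Placeholder values; real capacities depend on the pack's cell configuration.
  return (size == PACK_2_2_KWH) ? 30.0f : 50.0f;
}
```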
As of v5.8 we are still relying solely on battery voltage for state of charge (SOC) estimation.
This results in inaccuracies, especially since the voltage-to-SOC relationship is not linear and batteries sag under load.
We need to get to a more accurate estimator by factoring in at least:
Potentially also factoring in:
We can only get things like battery cycles/health from the BMS (Bluetooth interface), but all of the above are presently available to the controller.