As Is
The synchronization of the different loop cycles and players is currently (as of 2b23f37) done by sending a periodic audio signal over one of the channels in the audio stream and detecting the cross-correlation between these signals. This is a hack (though perhaps a common one; a similar approach in a video setting is mentioned in https://stackoverflow.com/a/35572224 from 2016).
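For illustration, the current hack boils down to estimating the time offset between a known periodic sync signal and the copy of it received on the dedicated channel. A minimal sketch of that offset estimation (names and signal shapes are hypothetical, not taken from the actual code):

```python
import numpy as np

def estimate_offset(reference, received, sample_rate):
    """Estimate how many seconds `received` lags behind `reference`.

    Cross-correlates the received sync channel against the known
    reference signal; the lag at the correlation peak is the offset
    in samples, which is then converted to seconds.
    """
    corr = np.correlate(received, reference, mode="full")
    # Index (len(reference) - 1) corresponds to zero lag.
    lag_samples = int(np.argmax(corr)) - (len(reference) - 1)
    return lag_samples / sample_rate

# Usage sketch: embed a reference burst 100 samples into a silent buffer
# and recover the offset.
rng = np.random.default_rng(0)
ref = rng.standard_normal(64)          # hypothetical sync burst
recv = np.zeros(1024)
recv[100:164] = ref                    # burst arrives 100 samples late
offset = estimate_offset(ref, recv, 48000)   # ~100 / 48000 s
```

This also shows why it is a hack: it burns an audio channel and CPU time on correlation just to recover timing information the system could, in principle, provide directly.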
To Be
Synchronization between the loop cycles and players should be done in a robust and resource-friendly way.
There is a recent (Apr 7, 2020) Stack Overflow question, https://stackoverflow.com/q/61072503, which, if answered, would provide exactly what is needed here.
There is a nice project, webtiming/timingsrc, on GitHub (https://github.com/webtiming/timingsrc/). However, this only works on the output side: once a media stream is tagged with precise information about which frame or sample corresponds to which point in time, it can adjust the player to keep playback in sync. This works well for playback of recorded content from files. It does not, however, solve the problem of how to tag each sample of a live stream with the point in time to which it corresponds.
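The output-side adjustment described above can be sketched generically (this is not timingsrc's actual API, just the control idea: compare the player's media position with where a shared clock says it should be, and nudge the playback rate to close the gap):

```python
def adjusted_rate(media_pos, target_pos, base_rate=1.0, gain=0.5, max_dev=0.05):
    """Return a playback rate that steers the player toward sync.

    media_pos  -- the player's current position in seconds
    target_pos -- position the shared clock says it should be at
    gain       -- proportional gain (hypothetical tuning value)
    max_dev    -- clamp so the tempo change stays small/inaudible
    """
    error = target_pos - media_pos          # + means the player lags
    rate = base_rate + gain * error         # speed up when lagging
    # Clamp to avoid audible pitch/tempo artifacts.
    return max(base_rate - max_dev, min(base_rate + max_dev, rate))

# Usage sketch: a player 20 ms behind the shared clock is sped up slightly.
rate = adjusted_rate(media_pos=10.0, target_pos=10.02)   # slightly > 1.0
```

The hard part for this issue is producing a trustworthy `target_pos` for a live stream in the first place, which is exactly the tagging problem noted above.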
It is not yet clear how best to do this.