Synchronization between multiple streamed audio RTP and low latency. #1091
About accessing the NTP timestamps, that sounds like a question about FFmpeg. You'll probably get more feedback by asking upstream about that. I wouldn't count on FFmpeg offering this kind of thing, though; see issue #996.
Well, it won't be exactly the same, you have audio streams, but yes, that's the general idea.
@saudet The implementation proposed at lu-zero/bmdtools#58 can be done by starting at line 767 and adding this:

Then at line 783, adding this:
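A minimal sketch of what reading that field could look like, assuming the patch exposes the demuxer's `AVFormatContext` through `FFmpegFrameGrabber.getFormatContext()` and that `start_time_realtime` holds the stream's wall-clock start in microseconds since the Unix epoch (the JavaCV call in the comment is the assumed access path; the conversion helper itself is plain Java):

```java
import java.time.Instant;

public class StartTimeDemo {
    // Convert FFmpeg's start_time_realtime (microseconds since the Unix epoch)
    // into a java.time.Instant for logging or comparison.
    static Instant fromMicros(long micros) {
        return Instant.ofEpochSecond(micros / 1_000_000L, (micros % 1_000_000L) * 1_000L);
    }

    public static void main(String[] args) {
        // Assumed usage with the patch applied (not compiled here):
        //   FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(rtspUrl);
        //   grabber.start();
        //   long startUs = grabber.getFormatContext().start_time_realtime();
        long startUs = 1_500_000_000_000_000L; // hypothetical value read from the context
        System.out.println(fromMicros(startUs)); // 2017-07-14T02:40:00Z
    }
}
```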
@saudet I think you have far more experience than I do. Do you think it is possible to synchronize multiple FFmpegFrameGrabber instances of RTSP streams?
If we have access to accurate timestamps, sure, that sounds good.
@saudet Ok, I'm trying to do that. So if we have three streams A, B, and C, and A starts first, then B after 60000 ms and C after 180000 ms: is there any way to get access to the start_time_realtime variable?
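Once each grabber's wall-clock start time is available, the relative delays fall out of simple arithmetic. A sketch, assuming the values are microseconds since the Unix epoch as FFmpeg's `start_time_realtime` reports them (the base value below is hypothetical):

```java
public class StreamOffsets {
    // Compute each stream's playback offset in milliseconds relative to the
    // earliest-starting stream, from wall-clock start times in microseconds.
    static long[] offsetsMs(long[] startUs) {
        long earliest = Long.MAX_VALUE;
        for (long s : startUs) earliest = Math.min(earliest, s);
        long[] out = new long[startUs.length];
        for (int i = 0; i < startUs.length; i++) {
            out[i] = (startUs[i] - earliest) / 1_000L; // micros -> ms
        }
        return out;
    }

    public static void main(String[] args) {
        // Streams A, B, C from the example: B starts 60000 ms after A, C 180000 ms after A.
        long base = 1_600_000_000_000_000L; // hypothetical epoch micros for stream A
        long[] offs = offsetsMs(new long[]{base, base + 60_000_000L, base + 180_000_000L});
        System.out.println(java.util.Arrays.toString(offs)); // [0, 60000, 180000]
    }
}
```

Each stream would then delay (or trim) its playback by its offset to line up with the others.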
We'll probably need to enhance FFmpeg first. I believe there is a patch for this at bytedeco/javacpp-presets#374, so we could try to apply it and see what that gives. Let me know if you have any questions and I will help. Thanks!
@srajca The patch is already available for download. Feel free to try it anytime you want!
Hello, everyone. I've been using JavaCV for almost 5 months, and I was able to implement a basic application that receives multiple RTSP streams from Android phones, as shown in this image.

My implementation creates a thread for each device and instantiates an FFmpegFrameGrabber object with the RTSP address and port of the device stream. The delay is noticeable and varies between approximately 0.8 and 1.2 seconds. So I have two questions. Is it possible to achieve real-time streaming, or at least reduce the delay to 0.2 seconds?
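On the latency question, a common approach is to pass low-latency AVOptions to the demuxer before starting the grabber. A sketch, assuming `FFmpegFrameGrabber.setOption(key, value)` forwards these to FFmpeg (the option names are standard FFmpeg AVOptions; the exact values are illustrative, and some trade robustness for latency):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LowLatencyOptions {
    // FFmpeg AVOptions commonly used to reduce RTSP buffering latency.
    static Map<String, String> lowLatencyOptions() {
        Map<String, String> opts = new LinkedHashMap<>();
        opts.put("fflags", "nobuffer");    // don't buffer input for probing
        opts.put("flags", "low_delay");    // low-delay decoder mode
        opts.put("rtsp_transport", "tcp"); // avoid UDP packet-loss stalls (a trade-off)
        opts.put("max_delay", "500000");   // cap demuxer reorder delay, in microseconds
        return opts;
    }

    public static void main(String[] args) {
        // Assumed usage with JavaCV (not compiled here):
        //   FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(rtspUrl);
        //   lowLatencyOptions().forEach(grabber::setOption);
        //   grabber.start();
        System.out.println(lowLatencyOptions());
    }
}
```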
I can achieve lower latency (between 0.2 and 0.3 seconds) when I run this command:

For example, the AudioPlayer has this code:
And my second question: is it possible to synchronize these RTP streams, perhaps using the NTP timestamp? However, the NTP timestamp is in the RTCP Sender Report packet; can I have access to that packet from FFmpegFrameGrabber?
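For reference, the RTCP Sender Report carries a 64-bit NTP timestamp: the upper 32 bits are seconds since 1900-01-01 and the lower 32 bits are a binary fraction of a second. Converting it to Unix time is pure arithmetic, whoever ends up exposing the packet:

```java
public class NtpToUnix {
    // Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
    static final long NTP_UNIX_OFFSET = 2_208_988_800L;

    // Convert a 64-bit NTP timestamp (from an RTCP Sender Report) to Unix
    // milliseconds. NTP timestamps are unsigned; Java's long may wrap, but
    // unsigned shifts and masks recover the fields correctly.
    static long ntpToUnixMillis(long ntp64) {
        long seconds = (ntp64 >>> 32) - NTP_UNIX_OFFSET;
        long fraction = ntp64 & 0xFFFFFFFFL;
        long millis = (fraction * 1_000L) >>> 32; // fractional part -> milliseconds
        return seconds * 1_000L + millis;
    }

    public static void main(String[] args) {
        long ntp = (NTP_UNIX_OFFSET + 1_500_000_000L) << 32; // hypothetical SR timestamp
        System.out.println(ntpToUnixMillis(ntp)); // 1500000000000
    }
}
```

Comparing these wall-clock timestamps across streams is what would let the players line up their RTP media clocks.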