
[FFmpeg-devel,RFC,0/3] Propagate PRFT side data

Message ID 20230921121720.362842-1-peron.clem@gmail.com

Message

Clément Péron Sept. 21, 2023, 12:16 p.m. UTC
Dear FFmpeg contributors,

I'm new to the FFmpeg code base and the audio/video sync world, so forgive me
in advance if my questions are a bit dumb.

I have a project where I need to synchronize multiple RTSP cameras with other
network sensors (sync with NTP or PTP).
In my case, I use FFmpeg to decode the RTSP stream and then output the raw
video to the stdout pipe.

After looking into rtpdec, I found multiple timestamps: PTS, DTS, and also the
PRFT (Producer Reference Timestamp). In my case the PRFT seems to be the correct one.

After several tests and some digging, I found that the AV_PKT_DATA_PRFT side
data produced by the RTSP demuxer doesn't seem to be forwarded to the
decoder/encoder, nor to the final muxer.

So I have multiple questions:

Is forwarding the AV_PKT_DATA_PRFT side data the correct solution?

I also saw that dashenc and movenc use this side data, but how do they get it?

At the moment I have a dirty hack to output the PRFT on stdout; is there
something "more standard" to communicate between FFmpeg and a Python script?

Thanks for your help,
Clément

Clément Péron (3):
  frame: decode: propagate PRFT side data packet to frame
  avcodec: rawenc: Forward PRFT frame data to packet
  HACK: avformat: rawenc: allow to output a raw PRFT

 libavcodec/decode.c      |   1 +
 libavcodec/rawenc.c      |  12 ++++
 libavfilter/f_sidedata.c |   1 +
 libavformat/rawenc.c     | 122 +++++++++++++++++++++++++++++++++++++++
 libavutil/frame.c        |   1 +
 libavutil/frame.h        |   4 ++
 6 files changed, 141 insertions(+)

Comments

Kieran Kunhya Sept. 21, 2023, 1:12 p.m. UTC | #1
On Thu, 21 Sept 2023, 13:17 Clément Péron, <peron.clem@gmail.com> wrote:

> I have a project where I need to synchronize multiple RTSP cameras with
> other
> network sensors (sync with NTP or PTP).
>

Just be aware that the clocks of the vast majority of cameras have no relation
to NTP or PTP, so you will have drift and will need to handle it (e.g. by
dropping or duplicating frames).

Kieran

Clément Péron Sept. 21, 2023, 3:41 p.m. UTC | #2
Hi Kieran,

On Thu, 21 Sept 2023 at 15:13, Kieran Kunhya <kierank@obe.tv> wrote:
>
>  On Thu, 21 Sept 2023, 13:17 Clément Péron, <peron.clem@gmail.com> wrote:
>>
>> I have a project where I need to synchronize multiple RTSP cameras with other
>> network sensors (sync with NTP or PTP).
>
>
> Just be aware the clock of the vast majority of cameras have no relation to NTP or PTP so you will have drift and need to handle that (e.g by dropping or duplicating frames).

Thanks for pointing this out. Yes, I treat each of my sensors as running on a
free-running clock, and I recreate a "virtual frame" timeline that is not
correlated with the FPS of each sensor.

Thanks,
Clement

Clément Péron Sept. 24, 2023, 9:12 a.m. UTC | #3
Hi,

I plan to resend this series without the last patch.

Regarding patches 1 and 2, do you have any comments?

One thing: unlike decode.c, which has a common
ff_decode_frame_props_from_pkt(), there is no equivalent helper for the encode
path. Or maybe I missed it?

I noticed that the propagation of this side data doesn't work when I enable
the NVIDIA hardware encoder.

Does it make sense to introduce a ff_encode_packet_props_from_frame()?

Thanks,


Clément Péron Oct. 24, 2023, 3:10 p.m. UTC | #4
Hi,

On Sun, 24 Sept 2023 at 11:12, Clément Péron <peron.clem@gmail.com> wrote:
>
> I noticed that the propagation of this data doesn't work when I enable
> the hardware Nvidia encoder.
>
> Does it make sense to introduce a ff_encode_packet_props_from_frame()?

So I investigated this and found that the cuvid packet has its own format and
is not capable of carrying this metadata.

So I'm not sure that forwarding the PRFT all the way through FFmpeg is the
right direction.

Does it make sense for the PTS to be an absolute timestamp?

If I look at rtpdec, it seems that everybody expects the timestamps to be relative:

https://github.com/FFmpeg/FFmpeg/blob/master/libavformat/rtpdec.c#L669-L694

Thanks,


