From patchwork Thu Jan 4 02:48:12 2024
X-Patchwork-Submitter: Aidan O Connor
X-Patchwork-Id: 45475
From: Aidan O Connor
To: "ffmpeg-devel@ffmpeg.org"
Date: Thu, 4 Jan 2024 02:48:12 +0000
Subject: [FFmpeg-devel] [PATCH] Update xfade filter to support a transition between two points in a single video stream.

Signed-off-by: Aidan O'Connor
---
 libavfilter/vf_xfade.c | 266 +++++++++++++++++++++++++++++++++--------
 1 file changed, 213 insertions(+), 53 deletions(-)

--
2.40.1.windows.1

diff --git a/libavfilter/vf_xfade.c b/libavfilter/vf_xfade.c
index 890995a608..65611beeb5 100644
--- a/libavfilter/vf_xfade.c
+++ b/libavfilter/vf_xfade.c
@@ -93,11 +93,18 @@ enum XFadeTransitions {
 typedef struct XFadeContext {
     const AVClass *class;

+    // Number of inputs. May be 1 for transition within stream or 2 for cross-fade between streams.
+    int nb_inputs;
+
     int transition;
     int64_t duration;
     int64_t offset;
     char *custom_str;

+    // Start & end time user options (single input only)
+    int64_t start;
+    int64_t end;
+
     int nb_planes;
     int depth;
     int is_rgb;
@@ -105,12 +112,18 @@ typedef struct XFadeContext {
     // PTS when the fade should start (in first inputs timebase)
     int64_t start_pts;

+    // PTS when the fade should end (single input only)
+    int64_t end_pts;
+
     // PTS offset between first and second input
     int64_t inputs_offset_pts;

     // Duration of the transition
     int64_t duration_pts;

+    // Frame duration (single input only)
+    int64_t frame_duration;
+
     // Current PTS of the first input
     int64_t pts;

@@ -118,6 +131,12 @@
     // like before and after the actual transition.
     int passthrough;

+    // Copy of transition start frame (single input only)
+    AVFrame *start_frame;
+
+    // Number of input frames discarded that are to be regenerated for the transition (single input only)
+    int nb_frames;
+
     int status[2];
     AVFrame *xf[2];
     int max_value;
@@ -169,6 +188,7 @@ static av_cold void uninit(AVFilterContext *ctx)
 #define FLAGS (AV_OPT_FLAG_FILTERING_PARAM | AV_OPT_FLAG_VIDEO_PARAM)

 static const AVOption xfade_options[] = {
+    { "inputs", "set number of inputs", OFFSET(nb_inputs), AV_OPT_TYPE_INT, { .i64 = 2 }, 1, 2, FLAGS },
     { "transition", "set cross fade transition", OFFSET(transition), AV_OPT_TYPE_INT, {.i64=FADE}, -1, NB_TRANSITIONS-1, FLAGS, "transition" },
     { "custom", "custom transition", 0, AV_OPT_TYPE_CONST, {.i64=CUSTOM}, 0, 0, FLAGS, "transition" },
     { "fade", "fade transition", 0, AV_OPT_TYPE_CONST, {.i64=FADE}, 0, 0, FLAGS, "transition" },
@@ -231,6 +251,8 @@ static const AVOption xfade_options[] = {
     { "revealdown", "reveal down transition", 0, AV_OPT_TYPE_CONST, {.i64=REVEALDOWN}, 0, 0, FLAGS, "transition" },
     { "duration", "set cross fade duration", OFFSET(duration), AV_OPT_TYPE_DURATION, {.i64=1000000}, 0, 60000000, FLAGS },
     { "offset", "set cross fade start relative to first input stream", OFFSET(offset), AV_OPT_TYPE_DURATION, {.i64=0}, INT64_MIN, INT64_MAX, FLAGS },
+    { "start", "set fade start time (single input only)", OFFSET(start), AV_OPT_TYPE_DURATION, {.i64=0}, INT64_MIN, INT64_MAX, FLAGS },
+    { "end", "set fade end time (single input only)", OFFSET(end), AV_OPT_TYPE_DURATION, {.i64=0}, INT64_MIN, INT64_MAX, FLAGS },
     { "expr", "set expression for custom transition", OFFSET(custom_str), AV_OPT_TYPE_STRING, {.str=NULL}, 0, 0, FLAGS },
     { NULL }
 };
@@ -2039,44 +2061,53 @@ static double b3(void *priv, double x, double y) { return getpix(priv, x, y, 3,
 static int config_output(AVFilterLink *outlink)
 {
     AVFilterContext *ctx = outlink->src;
-    AVFilterLink *inlink0 = ctx->inputs[0];
-    AVFilterLink *inlink1 = ctx->inputs[1];
     XFadeContext *s = ctx->priv;
+    unsigned nb_inputs = s->nb_inputs;
+    AVFilterLink *inlink0 = ctx->inputs[0];
+    AVFilterLink *inlink1 = nb_inputs > 1 ? ctx->inputs[1] : NULL;
     const AVPixFmtDescriptor *pix_desc = av_pix_fmt_desc_get(inlink0->format);

-    if (inlink0->w != inlink1->w || inlink0->h != inlink1->h) {
-        av_log(ctx, AV_LOG_ERROR, "First input link %s parameters "
-               "(size %dx%d) do not match the corresponding "
-               "second input link %s parameters (size %dx%d)\n",
-               ctx->input_pads[0].name, inlink0->w, inlink0->h,
-               ctx->input_pads[1].name, inlink1->w, inlink1->h);
-        return AVERROR(EINVAL);
-    }
+    if (nb_inputs == 1) {
+        if (!inlink0->frame_rate.num || !inlink0->frame_rate.den) {
+            av_log(ctx, AV_LOG_ERROR, "The input needs to be a constant frame rate; "
+                   "current rate of %d/%d is invalid\n", inlink0->frame_rate.num, inlink0->frame_rate.den);
+            return AVERROR(EINVAL);
+        }
+    } else if (nb_inputs == 2) {
+        if (inlink0->w != inlink1->w || inlink0->h != inlink1->h) {
+            av_log(ctx, AV_LOG_ERROR, "First input link %s parameters "
+                   "(size %dx%d) do not match the corresponding "
+                   "second input link %s parameters (size %dx%d)\n",
+                   ctx->input_pads[0].name, inlink0->w, inlink0->h,
+                   ctx->input_pads[1].name, inlink1->w, inlink1->h);
+            return AVERROR(EINVAL);
+        }

-    if (inlink0->time_base.num != inlink1->time_base.num ||
-        inlink0->time_base.den != inlink1->time_base.den) {
-        av_log(ctx, AV_LOG_ERROR, "First input link %s timebase "
-               "(%d/%d) do not match the corresponding "
-               "second input link %s timebase (%d/%d)\n",
-               ctx->input_pads[0].name, inlink0->time_base.num, inlink0->time_base.den,
-               ctx->input_pads[1].name, inlink1->time_base.num, inlink1->time_base.den);
-        return AVERROR(EINVAL);
-    }
+        if (inlink0->time_base.num != inlink1->time_base.num ||
+            inlink0->time_base.den != inlink1->time_base.den) {
+            av_log(ctx, AV_LOG_ERROR, "First input link %s timebase "
+                   "(%d/%d) do not match the corresponding "
+                   "second input link %s timebase (%d/%d)\n",
+                   ctx->input_pads[0].name, inlink0->time_base.num, inlink0->time_base.den,
+                   ctx->input_pads[1].name, inlink1->time_base.num, inlink1->time_base.den);
+            return AVERROR(EINVAL);
+        }

-    if (!inlink0->frame_rate.num || !inlink0->frame_rate.den) {
-        av_log(ctx, AV_LOG_ERROR, "The inputs needs to be a constant frame rate; "
-               "current rate of %d/%d is invalid\n", inlink0->frame_rate.num, inlink0->frame_rate.den);
-        return AVERROR(EINVAL);
-    }
+        if (!inlink0->frame_rate.num || !inlink0->frame_rate.den) {
+            av_log(ctx, AV_LOG_ERROR, "The inputs needs to be a constant frame rate; "
+                   "current rate of %d/%d is invalid\n", inlink0->frame_rate.num, inlink0->frame_rate.den);
+            return AVERROR(EINVAL);
+        }

-    if (inlink0->frame_rate.num != inlink1->frame_rate.num ||
-        inlink0->frame_rate.den != inlink1->frame_rate.den) {
-        av_log(ctx, AV_LOG_ERROR, "First input link %s frame rate "
-               "(%d/%d) do not match the corresponding "
-               "second input link %s frame rate (%d/%d)\n",
-               ctx->input_pads[0].name, inlink0->frame_rate.num, inlink0->frame_rate.den,
-               ctx->input_pads[1].name, inlink1->frame_rate.num, inlink1->frame_rate.den);
-        return AVERROR(EINVAL);
+        if (inlink0->frame_rate.num != inlink1->frame_rate.num ||
+            inlink0->frame_rate.den != inlink1->frame_rate.den) {
+            av_log(ctx, AV_LOG_ERROR, "First input link %s frame rate "
+                   "(%d/%d) do not match the corresponding "
+                   "second input link %s frame rate (%d/%d)\n",
+                   ctx->input_pads[0].name, inlink0->frame_rate.num, inlink0->frame_rate.den,
+                   ctx->input_pads[1].name, inlink1->frame_rate.num, inlink1->frame_rate.den);
+            return AVERROR(EINVAL);
+        }
     }

     outlink->w = inlink0->w;
@@ -2199,11 +2230,9 @@ static int xfade_slice(AVFilterContext *ctx, void *arg, int jobnr, int nb_jobs)
     return 0;
 }

-static int xfade_frame(AVFilterContext *ctx, AVFrame *a, AVFrame *b)
+static int xfade_frame(AVFilterContext *ctx, AVFrame *a, AVFrame *b, int64_t pts, float progress)
 {
-    XFadeContext *s = ctx->priv;
     AVFilterLink *outlink = ctx->outputs[0];
-    float progress = av_clipf(1.f - ((float)(s->pts - s->start_pts) / s->duration_pts), 0.f, 1.f);
     ThreadData td;
     AVFrame *out;

@@ -2216,7 +2245,7 @@ static int xfade_frame(AVFilterContext *ctx, AVFrame *a, AVFrame *b)
     ff_filter_execute(ctx, xfade_slice, &td, NULL,
                       FFMIN(outlink->h, ff_filter_get_nb_threads(ctx)));

-    out->pts = s->pts;
+    out->pts = pts;

     return ff_filter_frame(outlink, out);
 }
@@ -2297,6 +2326,7 @@ static int xfade_activate(AVFilterContext *avctx)
     // We are transitioning, so we need a frame from second input
     if (ff_inlink_check_available_frame(in_b)) {
         int ret;
+        float progress;

         ff_inlink_consume_frame(avctx->inputs[0], &s->xf[0]);
         ff_inlink_consume_frame(avctx->inputs[1], &s->xf[1]);
@@ -2311,7 +2341,8 @@ static int xfade_activate(AVFilterContext *avctx)
             ff_inlink_set_status(in_a, AVERROR_EOF);
             s->passthrough = 1;
         }
-        ret = xfade_frame(avctx, s->xf[0], s->xf[1]);
+        progress = av_clipf(1.f - ((float)(s->pts - s->start_pts) / s->duration_pts), 0.f, 1.f);
+        ret = xfade_frame(avctx, s->xf[0], s->xf[1], s->pts, progress);
         av_frame_free(&s->xf[0]);
         av_frame_free(&s->xf[1]);
         return ret;
@@ -2349,6 +2380,112 @@ static int xfade_activate(AVFilterContext *avctx)
     return FFERROR_NOT_READY;
 }

+/**
+ * Perform a transition between two points in the input video stream.
+ * Transition from the last frame before the specified start time to the first frame after the specified end time.
+ * All input frames between these points are discarded and replaced with new frames.
+ */
+static int tfade_activate(AVFilterContext *avctx)
+{
+    XFadeContext *s = avctx->priv;
+    AVFilterLink *in = avctx->inputs[0];
+    AVFilterLink *outlink = avctx->outputs[0];
+    int64_t status_pts;
+    AVFrame *frame;
+    AVFrame *end_frame;
+    int ret;
+
+    FF_FILTER_FORWARD_STATUS_BACK_ALL(outlink, avctx);
+
+    // We did not finish transitioning yet and the stream did not end either, so check if there are more frames to consume.
+    if (ff_inlink_check_available_frame(in)) {
+        AVFrame *peeked_frame = ff_inlink_peek_frame(in, 0);
+        s->pts = peeked_frame->pts;
+
+        // Initialize PTS values on first call.
+        if (s->start_pts == AV_NOPTS_VALUE) {
+            s->start_pts = s->pts + av_rescale_q(s->start, AV_TIME_BASE_Q, in->time_base);
+            s->end_pts = s->pts + av_rescale_q(s->end, AV_TIME_BASE_Q, in->time_base);
+            s->frame_duration = av_rescale_q(1, av_inv_q(in->frame_rate), in->time_base);
+            av_log(avctx, AV_LOG_INFO, "tfade_activate(): start_pts=%lld, end_pts=%lld, frame_duration=%lld\n", s->start_pts, s->end_pts, s->frame_duration);
+        }
+
+        if (s->pts <= s->start_pts) {
+            // Select the last frame before the specified start time
+            if (s->pts >= s->start_pts && s->pts < (s->start_pts + s->frame_duration)) {
+                av_log(avctx, AV_LOG_INFO, "tfade_activate(): start frame PTS=%lld\n", s->pts);
+                s->start_frame = av_frame_clone(peeked_frame);
+            }
+
+            s->passthrough = 1;
+            ff_inlink_consume_frame(in, &frame);
+            return ff_filter_frame(outlink, frame);
+        } else if (s->pts > s->start_pts && s->pts < s->end_pts) {
+            // During transition just discard the input frame. Count discarded frames so they can be replaced later.
+            s->passthrough = 0;
+            s->nb_frames++;
+
+            ff_inlink_consume_frame(in, &frame);
+            ff_inlink_request_frame(in);
+            return 0;
+        } else if (s->pts >= s->end_pts) {
+            // Select the first frame after the specified end time
+            if (s->pts >= s->end_pts && s->pts < (s->end_pts + s->frame_duration)) {
+                ff_inlink_consume_frame(in, &end_frame);
+                s->nb_frames++;
+
+                av_log(avctx, AV_LOG_INFO, "tfade_activate(): End frame PTS=%lld, Number of frames = %d\n", s->pts, s->nb_frames);
+
+                // Replace discarded input frames with transition frames
+                for (int i = 0; i < s->nb_frames; i++) {
+                    int64_t pts = s->start_pts + (s->frame_duration * i);
+                    float progress = av_clipf(1.f - ((float)i / s->nb_frames), 0.f, 1.f);
+                    ret = xfade_frame(avctx, s->start_frame, end_frame, pts, progress);
+                }
+
+                av_frame_free(&s->start_frame);
+                av_frame_free(&end_frame);
+
+                return ret;
+            } else {
+                // After the end transition point just forward the input frame
+                s->passthrough = 1;
+                ff_inlink_consume_frame(in, &s->start_frame);
+                return ff_filter_frame(outlink, s->start_frame);
+            }
+        }
+    }
+
+    // We did not get a frame from input, check its status.
+    if (ff_inlink_acknowledge_status(in, &s->status[0], &status_pts)) {
+        // Input is EOF so report EOF output.
+        ff_outlink_set_status(outlink, s->status[0], s->pts);
+        return 0;
+    }
+
+    // We have no frames yet from input and no EOF, so request some.
+    if (ff_outlink_frame_wanted(outlink)) {
+        ff_inlink_request_frame(in);
+        return 0;
+    }
+
+    return FFERROR_NOT_READY;
+}
+
+/**
+ * Select between the single-stream tfade or dual-stream xfade.
+ */
+static int activate(AVFilterContext *avctx)
+{
+    XFadeContext *s = avctx->priv;
+    if (s->nb_inputs == 1)
+        return tfade_activate(avctx);
+    else if (s->nb_inputs == 2)
+        return xfade_activate(avctx);
+    else
+        return AVERROR_BUG;
+}
+
 static AVFrame *get_video_buffer(AVFilterLink *inlink, int w, int h)
 {
     XFadeContext *s = inlink->dst->priv;
@@ -2358,18 +2495,40 @@ static AVFrame *get_video_buffer(AVFilterLink *inlink, int w, int h)
         ff_default_get_video_buffer(inlink, w, h);
 }

-static const AVFilterPad xfade_inputs[] = {
-    {
-        .name = "main",
-        .type = AVMEDIA_TYPE_VIDEO,
-        .get_buffer.video = get_video_buffer,
-    },
-    {
-        .name = "xfade",
-        .type = AVMEDIA_TYPE_VIDEO,
-        .get_buffer.video = get_video_buffer,
-    },
-};
+/**
+ * Set up the input pads depending on whether single- or dual-stream input is used.
+ */
+static av_cold int init(AVFilterContext *avctx)
+{
+    XFadeContext *s = avctx->priv;
+    int ret;
+
+    if (s->nb_inputs == 1) {
+        AVFilterPad pad = {
+            .name = "main",
+            .type = AVMEDIA_TYPE_VIDEO,
+            .get_buffer = {get_video_buffer}
+        };
+        if ((ret = ff_append_inpad(avctx, &pad)) < 0)
+            return ret;
+
+        //avctx->filter->activate = tfade_activate; // Is there an API to assign activate function?
+    }
+
+    if (s->nb_inputs == 2) {
+        AVFilterPad pad = {
+            .name = "xfade",
+            .type = AVMEDIA_TYPE_VIDEO,
+            .get_buffer = {get_video_buffer}
+        };
+        if ((ret = ff_append_inpad(avctx, &pad)) < 0)
+            return ret;
+
+        //avctx->filter->activate = xfade_activate;
+    }
+
+    return 0;
+}

 static const AVFilterPad xfade_outputs[] = {
     {
@@ -2381,13 +2540,14 @@ static const AVFilterPad xfade_outputs[] = {

 const AVFilter ff_vf_xfade = {
     .name          = "xfade",
-    .description   = NULL_IF_CONFIG_SMALL("Cross fade one video with another video."),
+    .description   = NULL_IF_CONFIG_SMALL("Cross fade one video with another video, or between two points in a single video."),
     .priv_size     = sizeof(XFadeContext),
     .priv_class    = &xfade_class,
-    .activate      = xfade_activate,
+    .init          = init,
+    .activate      = activate,
     .uninit        = uninit,
-    FILTER_INPUTS(xfade_inputs),
+    .inputs        = NULL,
     FILTER_OUTPUTS(xfade_outputs),
     FILTER_PIXFMTS_ARRAY(pix_fmts),
-    .flags         = AVFILTER_FLAG_SLICE_THREADS,
+    .flags         = AVFILTER_FLAG_SLICE_THREADS | AVFILTER_FLAG_DYNAMIC_INPUTS,
 };
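
For anyone trying the patch, a minimal usage sketch of the proposed single-input mode (option names as added above; file names and times are placeholder values):

    ffmpeg -i input.mp4 -vf "xfade=inputs=1:transition=fade:start=10:end=12" output.mp4

With inputs=1 the filter consumes a single stream: frames whose timestamps fall between start and end are dropped and regenerated by blending the last frame before start with the first frame after end, while frames outside that window pass through unchanged.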