From patchwork Tue Jan 16 22:52:51 2024
X-Patchwork-Submitter: Aidan O Connor
X-Patchwork-Id: 45619
From: Aidan O Connor
To: ffmpeg-devel
Date: Tue, 16 Jan 2024 22:52:51 +0000
Subject: [FFmpeg-devel] [PATCH] avfilter/vf_xfade: Update to support transition between two points in single video stream.
List-Id: FFmpeg development discussions and patches

This patch replaces my previous patch from Jan. 4 2024, which had bugs.

Signed-off-by: Aidan O'Connor
---
 libavfilter/vf_xfade.c | 260 ++++++++++++++++++++++++++++++++---------
 1 file changed, 207 insertions(+), 53 deletions(-)

--
2.40.1.windows.1

diff --git a/libavfilter/vf_xfade.c b/libavfilter/vf_xfade.c
index 890995a608..f50ae1ada1 100644
--- a/libavfilter/vf_xfade.c
+++ b/libavfilter/vf_xfade.c
@@ -93,11 +93,18 @@ enum XFadeTransitions {
 typedef struct XFadeContext {
     const AVClass *class;
 
+    // Number of inputs. May be 1 for transition within stream or 2 for cross-fade between streams.
+    int nb_inputs;
+
     int transition;
     int64_t duration;
     int64_t offset;
     char *custom_str;
 
+    // Start & end time user options (single input only)
+    int64_t start;
+    int64_t end;
+
     int nb_planes;
     int depth;
     int is_rgb;
@@ -105,12 +112,18 @@ typedef struct XFadeContext {
     // PTS when the fade should start (in first inputs timebase)
     int64_t start_pts;
 
+    // PTS when the fade should end (single input only)
+    int64_t end_pts;
+
     // PTS offset between first and second input
     int64_t inputs_offset_pts;
 
     // Duration of the transition
     int64_t duration_pts;
 
+    // Frame duration (single input only)
+    int64_t frame_duration;
+
     // Current PTS of the first input
     int64_t pts;
 
@@ -118,6 +131,12 @@
     // If frames are currently just passed through unmodified,
     // like before and after the actual transition.
     int passthrough;
 
+    // Copy of transition start frame (single input only)
+    AVFrame *start_frame;
+
+    // Number of input frames discarded that are to be regenerated for the transition (single input only)
+    int nb_frames;
+
     int status[2];
     AVFrame *xf[2];
     int max_value;
@@ -169,6 +188,7 @@ static av_cold void uninit(AVFilterContext *ctx)
 #define FLAGS (AV_OPT_FLAG_FILTERING_PARAM | AV_OPT_FLAG_VIDEO_PARAM)
 
 static const AVOption xfade_options[] = {
+    { "inputs", "set number of inputs", OFFSET(nb_inputs), AV_OPT_TYPE_INT, { .i64 = 2 }, 1, 2, FLAGS },
     { "transition", "set cross fade transition", OFFSET(transition), AV_OPT_TYPE_INT, {.i64=FADE}, -1, NB_TRANSITIONS-1, FLAGS, "transition" },
     { "custom", "custom transition", 0, AV_OPT_TYPE_CONST, {.i64=CUSTOM}, 0, 0, FLAGS, "transition" },
     { "fade", "fade transition", 0, AV_OPT_TYPE_CONST, {.i64=FADE}, 0, 0, FLAGS, "transition" },
@@ -231,6 +251,8 @@ static const AVOption xfade_options[] = {
     { "revealdown", "reveal down transition", 0, AV_OPT_TYPE_CONST, {.i64=REVEALDOWN}, 0, 0, FLAGS, "transition" },
     { "duration", "set cross fade duration", OFFSET(duration), AV_OPT_TYPE_DURATION, {.i64=1000000}, 0, 60000000, FLAGS },
     { "offset", "set cross fade start relative to first input stream", OFFSET(offset), AV_OPT_TYPE_DURATION, {.i64=0}, INT64_MIN, INT64_MAX, FLAGS },
+    { "start", "set fade start time (single input only)", OFFSET(start), AV_OPT_TYPE_DURATION, {.i64=0}, INT64_MIN, INT64_MAX, FLAGS },
+    { "end", "set fade end time (single input only)", OFFSET(end), AV_OPT_TYPE_DURATION, {.i64=0}, INT64_MIN, INT64_MAX, FLAGS },
     { "expr", "set expression for custom transition", OFFSET(custom_str), AV_OPT_TYPE_STRING, {.str=NULL}, 0, 0, FLAGS },
     { NULL }
 };
@@ -2039,44 +2061,53 @@ static double b3(void *priv, double x, double y) { return getpix(priv, x, y, 3,
 static int config_output(AVFilterLink *outlink)
 {
     AVFilterContext *ctx = outlink->src;
-    AVFilterLink *inlink0 = ctx->inputs[0];
-    AVFilterLink *inlink1 = ctx->inputs[1];
     XFadeContext *s = ctx->priv;
+    unsigned nb_inputs = s->nb_inputs;
+    AVFilterLink *inlink0 = ctx->inputs[0];
+    AVFilterLink *inlink1 = nb_inputs > 1 ? ctx->inputs[1] : NULL;
     const AVPixFmtDescriptor *pix_desc = av_pix_fmt_desc_get(inlink0->format);
 
-    if (inlink0->w != inlink1->w || inlink0->h != inlink1->h) {
-        av_log(ctx, AV_LOG_ERROR, "First input link %s parameters "
-               "(size %dx%d) do not match the corresponding "
-               "second input link %s parameters (size %dx%d)\n",
-               ctx->input_pads[0].name, inlink0->w, inlink0->h,
-               ctx->input_pads[1].name, inlink1->w, inlink1->h);
-        return AVERROR(EINVAL);
-    }
+    if (nb_inputs == 1) {
+        if (!inlink0->frame_rate.num || !inlink0->frame_rate.den) {
+            av_log(ctx, AV_LOG_ERROR, "The input needs to be a constant frame rate; "
+                   "current rate of %d/%d is invalid\n", inlink0->frame_rate.num, inlink0->frame_rate.den);
+            return AVERROR(EINVAL);
+        }
+    } else if (nb_inputs == 2) {
+        if (inlink0->w != inlink1->w || inlink0->h != inlink1->h) {
+            av_log(ctx, AV_LOG_ERROR, "First input link %s parameters "
+                   "(size %dx%d) do not match the corresponding "
+                   "second input link %s parameters (size %dx%d)\n",
+                   ctx->input_pads[0].name, inlink0->w, inlink0->h,
+                   ctx->input_pads[1].name, inlink1->w, inlink1->h);
+            return AVERROR(EINVAL);
+        }
 
-    if (inlink0->time_base.num != inlink1->time_base.num ||
-        inlink0->time_base.den != inlink1->time_base.den) {
-        av_log(ctx, AV_LOG_ERROR, "First input link %s timebase "
-               "(%d/%d) do not match the corresponding "
-               "second input link %s timebase (%d/%d)\n",
-               ctx->input_pads[0].name, inlink0->time_base.num, inlink0->time_base.den,
-               ctx->input_pads[1].name, inlink1->time_base.num, inlink1->time_base.den);
-        return AVERROR(EINVAL);
-    }
+        if (inlink0->time_base.num != inlink1->time_base.num ||
+            inlink0->time_base.den != inlink1->time_base.den) {
+            av_log(ctx, AV_LOG_ERROR, "First input link %s timebase "
+                   "(%d/%d) do not match the corresponding "
+                   "second input link %s timebase (%d/%d)\n",
+                   ctx->input_pads[0].name, inlink0->time_base.num, inlink0->time_base.den,
+                   ctx->input_pads[1].name, inlink1->time_base.num, inlink1->time_base.den);
+            return AVERROR(EINVAL);
+        }
 
-    if (!inlink0->frame_rate.num || !inlink0->frame_rate.den) {
-        av_log(ctx, AV_LOG_ERROR, "The inputs needs to be a constant frame rate; "
-               "current rate of %d/%d is invalid\n", inlink0->frame_rate.num, inlink0->frame_rate.den);
-        return AVERROR(EINVAL);
-    }
+        if (!inlink0->frame_rate.num || !inlink0->frame_rate.den) {
+            av_log(ctx, AV_LOG_ERROR, "The inputs needs to be a constant frame rate; "
+                   "current rate of %d/%d is invalid\n", inlink0->frame_rate.num, inlink0->frame_rate.den);
+            return AVERROR(EINVAL);
+        }
 
-    if (inlink0->frame_rate.num != inlink1->frame_rate.num ||
-        inlink0->frame_rate.den != inlink1->frame_rate.den) {
-        av_log(ctx, AV_LOG_ERROR, "First input link %s frame rate "
-               "(%d/%d) do not match the corresponding "
-               "second input link %s frame rate (%d/%d)\n",
-               ctx->input_pads[0].name, inlink0->frame_rate.num, inlink0->frame_rate.den,
-               ctx->input_pads[1].name, inlink1->frame_rate.num, inlink1->frame_rate.den);
-        return AVERROR(EINVAL);
+        if (inlink0->frame_rate.num != inlink1->frame_rate.num ||
+            inlink0->frame_rate.den != inlink1->frame_rate.den) {
+            av_log(ctx, AV_LOG_ERROR, "First input link %s frame rate "
+                   "(%d/%d) do not match the corresponding "
+                   "second input link %s frame rate (%d/%d)\n",
+                   ctx->input_pads[0].name, inlink0->frame_rate.num, inlink0->frame_rate.den,
+                   ctx->input_pads[1].name, inlink1->frame_rate.num, inlink1->frame_rate.den);
+            return AVERROR(EINVAL);
+        }
     }
 
     outlink->w = inlink0->w;
@@ -2199,11 +2230,9 @@ static int xfade_slice(AVFilterContext *ctx, void *arg, int jobnr, int nb_jobs)
     return 0;
 }
 
-static int xfade_frame(AVFilterContext *ctx, AVFrame *a, AVFrame *b)
+static int xfade_frame(AVFilterContext *ctx, AVFrame *a, AVFrame *b, int64_t pts, float progress)
 {
-    XFadeContext *s = ctx->priv;
     AVFilterLink *outlink = ctx->outputs[0];
-    float progress = av_clipf(1.f - ((float)(s->pts - s->start_pts) / s->duration_pts), 0.f, 1.f);
     ThreadData td;
     AVFrame *out;
 
@@ -2216,7 +2245,7 @@ static int xfade_frame(AVFilterContext *ctx, AVFrame *a, AVFrame *b)
     ff_filter_execute(ctx, xfade_slice, &td, NULL,
                       FFMIN(outlink->h, ff_filter_get_nb_threads(ctx)));
 
-    out->pts = s->pts;
+    out->pts = pts;
 
     return ff_filter_frame(outlink, out);
 }
@@ -2297,6 +2326,7 @@ static int xfade_activate(AVFilterContext *avctx)
     // We are transitioning, so we need a frame from second input
     if (ff_inlink_check_available_frame(in_b)) {
         int ret;
+        float progress;
 
         ff_inlink_consume_frame(avctx->inputs[0], &s->xf[0]);
         ff_inlink_consume_frame(avctx->inputs[1], &s->xf[1]);
@@ -2311,7 +2341,8 @@ static int xfade_activate(AVFilterContext *avctx)
             ff_inlink_set_status(in_a, AVERROR_EOF);
             s->passthrough = 1;
         }
-        ret = xfade_frame(avctx, s->xf[0], s->xf[1]);
+        progress = av_clipf(1.f - ((float)(s->pts - s->start_pts) / s->duration_pts), 0.f, 1.f);
+        ret = xfade_frame(avctx, s->xf[0], s->xf[1], s->pts, progress);
         av_frame_free(&s->xf[0]);
         av_frame_free(&s->xf[1]);
         return ret;
@@ -2349,6 +2380,110 @@ static int xfade_activate(AVFilterContext *avctx)
     return FFERROR_NOT_READY;
 }
 
+/**
+ * Perform a transition between two points in the input video stream.
+ * Transition between the last frame before the specified start time to the first frame after the specified end time.
+ * All input frames between these points are discarded and replaced with new frames.
+ */
+static int tfade_activate(AVFilterContext *avctx)
+{
+    XFadeContext *s = avctx->priv;
+    AVFilterLink *in = avctx->inputs[0];
+    AVFilterLink *outlink = avctx->outputs[0];
+    int64_t status_pts;
+    AVFrame *frame;
+    AVFrame *end_frame;
+    int ret;
+
+    FF_FILTER_FORWARD_STATUS_BACK_ALL(outlink, avctx);
+
+    // We did not finish transitioning yet and the stream did not end either, so check if there are more frames to consume.
+    if (ff_inlink_check_available_frame(in)) {
+        AVFrame *peeked_frame = ff_inlink_peek_frame(in, 0);
+        s->pts = peeked_frame->pts;
+
+        // Initialize PTS values on first call.
+        if (s->start_pts == AV_NOPTS_VALUE) {
+            s->start_pts = s->pts + av_rescale_q(s->start, AV_TIME_BASE_Q, in->time_base);
+            s->end_pts = s->pts + av_rescale_q(s->end, AV_TIME_BASE_Q, in->time_base);
+            s->frame_duration = av_rescale_q(1, av_inv_q(in->frame_rate), in->time_base);
+            av_log(avctx, AV_LOG_DEBUG, "tfade_activate(); start_pts=%ld, end_pts=%ld, frame_duration=%ld\n", s->start_pts, s->end_pts, s->frame_duration);
+        }
+
+        if (s->pts < s->start_pts) {
+            s->passthrough = 1;
+            ff_inlink_consume_frame(in, &frame);
+            return ff_filter_frame(outlink, frame);
+        } else if (s->pts >= s->start_pts && s->pts < (s->start_pts + s->frame_duration)) {
+            av_log(avctx, AV_LOG_DEBUG, "tfade_activate(): start frame PTS=%ld\n", s->pts);
+            s->start_frame = av_frame_clone(peeked_frame);
+
+            s->passthrough = 1;
+            ff_inlink_consume_frame(in, &frame);
+            return ff_filter_frame(outlink, frame);
+        } else if (s->pts > s->start_pts && s->pts < s->end_pts) {
+            // During transition just discard input frame. Count discarded frames so they can be replaced later
+            s->passthrough = 0;
+            s->nb_frames++;
+
+            ff_inlink_consume_frame(in, &frame);
+            ff_inlink_request_frame(in);
+            return 0;
+        } else if (s->pts >= s->end_pts && s->pts < (s->end_pts + s->frame_duration)) {
+            ff_inlink_consume_frame(in, &end_frame);
+            s->nb_frames++;
+
+            av_log(avctx, AV_LOG_DEBUG, "tfade_activate(): End frame PTS=%ld, Number of frames = %d\n", s->pts, s->nb_frames);
+
+            // Replace discarded input frames with transition frames
+            for (int i = 0; i < s->nb_frames; i++) {
+                int64_t pts = s->start_pts + (s->frame_duration * i);
+                float progress = av_clipf(1.f - ((float)i / s->nb_frames), 0.f, 1.f);
+                ret = xfade_frame(avctx, s->start_frame, end_frame, pts, progress);
+            }
+
+            av_frame_free(&s->start_frame);
+            av_frame_free(&end_frame);
+
+            return ret;
+        } else {
+            // After the end transition point just request and forward the input frame
+            s->passthrough = 1;
+            ff_inlink_consume_frame(in, &s->start_frame);
+            return ff_filter_frame(outlink, s->start_frame);
+        }
+    }
+
+    // We did not get a frame from input, check its status.
+    if (ff_inlink_acknowledge_status(in, &s->status[0], &status_pts)) {
+        // Input is EOF so report EOF output.
+        ff_outlink_set_status(outlink, s->status[0], s->pts);
+        return 0;
+    }
+
+    // We have no frames yet from input and no EOF, so request some.
+    if (ff_outlink_frame_wanted(outlink)) {
+        ff_inlink_request_frame(in);
+        return 0;
+    }
+
+    return FFERROR_NOT_READY;
+}
+
+/**
+ * Select between the single-stream tfade or dual-stream xfade.
+ */
+static int activate(AVFilterContext *avctx)
+{
+    XFadeContext *s = avctx->priv;
+    if (s->nb_inputs == 1)
+        return tfade_activate(avctx);
+    else if (s->nb_inputs == 2)
+        return xfade_activate(avctx);
+    else
+        return AVERROR_BUG;
+}
+
 static AVFrame *get_video_buffer(AVFilterLink *inlink, int w, int h)
 {
     XFadeContext *s = inlink->dst->priv;
@@ -2358,18 +2493,36 @@ static AVFrame *get_video_buffer(AVFilterLink *inlink, int w, int h)
         ff_default_get_video_buffer(inlink, w, h);
 }
 
-static const AVFilterPad xfade_inputs[] = {
-    {
-        .name = "main",
-        .type = AVMEDIA_TYPE_VIDEO,
-        .get_buffer.video = get_video_buffer,
-    },
-    {
-        .name = "xfade",
-        .type = AVMEDIA_TYPE_VIDEO,
-        .get_buffer.video = get_video_buffer,
-    },
-};
+/**
+ * Set up the input pads depending on whether there are single or dual-stream inputs.
+ */
+static av_cold int init(AVFilterContext *avctx)
+{
+    XFadeContext *s = avctx->priv;
+    int ret;
+
+    if (s->nb_inputs >= 1) {
+        AVFilterPad pad = { 0 };
+        pad.name = "main";
+        pad.type = AVMEDIA_TYPE_VIDEO;
+        pad.get_buffer.video = get_video_buffer;
+        if ((ret = ff_append_inpad(avctx, &pad)) < 0)
+            return ret;
+
+        //avctx->filter->activate = tfade_activate; // Is there an API to assign activate function?
+    }
+
+    if (s->nb_inputs == 2) {
+        AVFilterPad pad = { 0 };
+        pad.name = "xfade";
+        pad.type = AVMEDIA_TYPE_VIDEO;
+        pad.get_buffer.video = get_video_buffer;
+        if ((ret = ff_append_inpad(avctx, &pad)) < 0)
+            return ret;
+    }
+
+    return 0;
+}
 
 static const AVFilterPad xfade_outputs[] = {
     {
@@ -2381,13 +2534,14 @@ static const AVFilterPad xfade_outputs[] = {
 
 const AVFilter ff_vf_xfade = {
     .name = "xfade",
-    .description = NULL_IF_CONFIG_SMALL("Cross fade one video with another video."),
+    .description = NULL_IF_CONFIG_SMALL("Cross fade one video with another video, or between two points in a single video."),
     .priv_size = sizeof(XFadeContext),
     .priv_class = &xfade_class,
-    .activate = xfade_activate,
+    .init = init,
+    .activate = activate,
     .uninit = uninit,
-    FILTER_INPUTS(xfade_inputs),
+    .inputs = NULL,
     FILTER_OUTPUTS(xfade_outputs),
     FILTER_PIXFMTS_ARRAY(pix_fmts),
-    .flags = AVFILTER_FLAG_SLICE_THREADS,
+    .flags = AVFILTER_FLAG_SLICE_THREADS | AVFILTER_FLAG_DYNAMIC_INPUTS,
 };
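
Example usage (an untested sketch based on the options added by this patch; the file names and timestamps below are placeholders):

    # existing dual-input cross-fade, unchanged behaviour
    ffmpeg -i first.mp4 -i second.mp4 -filter_complex "[0:v][1:v]xfade=transition=fade:duration=1:offset=4" output.mp4

    # proposed single-input mode: transition from the last frame before start=5
    # to the first frame after end=6, regenerating the frames in between
    ffmpeg -i input.mp4 -filter_complex "[0:v]xfade=inputs=1:transition=fade:start=5:end=6" output.mp4

Note that, as far as the diff goes, the duration option is not read in the single-input path; the transition length is defined by end - start, and the discarded frames are rebuilt from the two boundary frames.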