From patchwork Sun Aug 23 16:04:05 2020
X-Patchwork-Submitter: "Fu, Ting"
X-Patchwork-Id: 21851
From: Ting Fu
To: ffmpeg-devel@ffmpeg.org
Date: Mon, 24 Aug 2020 00:04:05 +0800
Message-Id: <20200823160405.22459-2-ting.fu@intel.com>
X-Mailer: git-send-email 2.17.1
In-Reply-To: <20200823160405.22459-1-ting.fu@intel.com>
References: <20200823160405.22459-1-ting.fu@intel.com>
Subject: [FFmpeg-devel] [PATCH V4 2/2] dnn/native: add log error
 message

Signed-off-by: Ting Fu
---
 libavfilter/dnn/dnn_backend_native.c          | 59 +++++++++++++++----
 libavfilter/dnn/dnn_backend_native.h          |  5 ++
 .../dnn/dnn_backend_native_layer_avgpool.c    | 10 +++-
 .../dnn/dnn_backend_native_layer_avgpool.h    |  2 +-
 .../dnn/dnn_backend_native_layer_conv2d.c     | 10 +++-
 .../dnn/dnn_backend_native_layer_conv2d.h     |  2 +-
 .../dnn_backend_native_layer_depth2space.c    | 10 +++-
 .../dnn_backend_native_layer_depth2space.h    |  2 +-
 .../dnn/dnn_backend_native_layer_mathbinary.c | 11 +++-
 .../dnn/dnn_backend_native_layer_mathbinary.h |  2 +-
 .../dnn/dnn_backend_native_layer_mathunary.c  | 11 +++-
 .../dnn/dnn_backend_native_layer_mathunary.h  |  2 +-
 .../dnn/dnn_backend_native_layer_maximum.c    | 10 +++-
 .../dnn/dnn_backend_native_layer_maximum.h    |  2 +-
 .../dnn/dnn_backend_native_layer_pad.c        | 10 +++-
 .../dnn/dnn_backend_native_layer_pad.h        |  2 +-
 libavfilter/dnn/dnn_backend_native_layers.h   |  2 +-
 tests/dnn/dnn-layer-avgpool-test.c            |  4 +-
 tests/dnn/dnn-layer-conv2d-test.c             |  4 +-
 tests/dnn/dnn-layer-depth2space-test.c        |  2 +-
 tests/dnn/dnn-layer-mathbinary-test.c         |  6 +-
 tests/dnn/dnn-layer-mathunary-test.c          |  2 +-
 tests/dnn/dnn-layer-maximum-test.c            |  2 +-
 tests/dnn/dnn-layer-pad-test.c                |  6 +-
 24 files changed, 125 insertions(+), 53 deletions(-)

diff --git a/libavfilter/dnn/dnn_backend_native.c b/libavfilter/dnn/dnn_backend_native.c
index 2f5095f2ee..0cb764686d 100644
--- a/libavfilter/dnn/dnn_backend_native.c
+++ b/libavfilter/dnn/dnn_backend_native.c
@@ -28,15 +28,26 @@
 #include "dnn_backend_native_layer_conv2d.h"
 #include "dnn_backend_native_layers.h"
 
+static const AVClass dnn_native_class = {
+    .class_name = "dnn_native",
+    .item_name  = av_default_item_name,
+    .option     = NULL,
+    .version    = LIBAVUTIL_VERSION_INT,
+    .category   = AV_CLASS_CATEGORY_FILTER,
+};
+
 static DNNReturnType get_input_native(void *model, DNNData *input, const char *input_name)
 {
     NativeModel *native_model = (NativeModel *)model;
+    NativeContext *ctx = &native_model->ctx;
 
     for (int i = 0; i < native_model->operands_num; ++i) {
         DnnOperand *oprd = &native_model->operands[i];
         if (strcmp(oprd->name, input_name) == 0) {
-            if (oprd->type != DOT_INPUT)
+            if (oprd->type != DOT_INPUT) {
+                av_log(ctx, AV_LOG_ERROR, "Found \"%s\" in model, but it is not an input node\n", input_name);
                 return DNN_ERROR;
+            }
             input->dt = oprd->data_type;
             av_assert0(oprd->dims[0] == 1);
             input->height = oprd->dims[1];
@@ -47,30 +58,37 @@ static DNNReturnType get_input_native(void *model, DNNData *input, const char *i
     }
 
     // do not find the input operand
+    av_log(ctx, AV_LOG_ERROR, "Could not find \"%s\" in model\n", input_name);
     return DNN_ERROR;
 }
 
 static DNNReturnType set_input_output_native(void *model, DNNData *input, const char *input_name, const char **output_names, uint32_t nb_output)
 {
     NativeModel *native_model = (NativeModel *)model;
+    NativeContext *ctx = &native_model->ctx;
     DnnOperand *oprd = NULL;
 
-    if (native_model->layers_num <= 0 || native_model->operands_num <= 0)
+    if (native_model->layers_num <= 0 || native_model->operands_num <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "No operands or layers in model\n");
         return DNN_ERROR;
+    }
 
     /* inputs */
     for (int i = 0; i < native_model->operands_num; ++i) {
         oprd = &native_model->operands[i];
         if (strcmp(oprd->name, input_name) == 0) {
-            if (oprd->type != DOT_INPUT)
+            if (oprd->type != DOT_INPUT) {
+                av_log(ctx, AV_LOG_ERROR, "Found \"%s\" in model, but it is not an input node\n", input_name);
                 return DNN_ERROR;
+            }
             break;
         }
         oprd = NULL;
     }
-
-    if (!oprd)
+    if (!oprd) {
+        av_log(ctx, AV_LOG_ERROR, "Could not find \"%s\" in model\n", input_name);
         return DNN_ERROR;
+    }
 
     oprd->dims[0] = 1;
     oprd->dims[1] = input->height;
@@ -79,11 +97,15 @@ static DNNReturnType set_input_output_native(void *model, DNNData *input, const
     av_freep(&oprd->data);
     oprd->length = calculate_operand_data_length(oprd);
-    if (oprd->length <= 0)
+    if (oprd->length <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "The input data length overflows\n");
         return DNN_ERROR;
+    }
     oprd->data = av_malloc(oprd->length);
-    if (!oprd->data)
+    if (!oprd->data) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to malloc memory for input data\n");
         return DNN_ERROR;
+    }
 
     input->data = oprd->data;
 
@@ -91,8 +113,10 @@ static DNNReturnType set_input_output_native(void *model, DNNData *input, const
     native_model->nb_output = 0;
     av_freep(&native_model->output_indexes);
     native_model->output_indexes = av_mallocz_array(nb_output, sizeof(*native_model->output_indexes));
-    if (!native_model->output_indexes)
+    if (!native_model->output_indexes) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to malloc memory for output\n");
         return DNN_ERROR;
+    }
 
     for (uint32_t i = 0; i < nb_output; ++i) {
         const char *output_name = output_names[i];
@@ -105,8 +129,10 @@ static DNNReturnType set_input_output_native(void *model, DNNData *input, const
         }
     }
 
-    if (native_model->nb_output != nb_output)
+    if (native_model->nb_output != nb_output) {
+        av_log(ctx, AV_LOG_ERROR, "Output name(s) are not all set correctly\n");
         return DNN_ERROR;
+    }
 
     return DNN_SUCCESS;
 }
 
@@ -171,6 +197,8 @@ DNNModel *ff_dnn_load_model_native(const char *model_filename, const char *optio
     if (!native_model){
         goto fail;
     }
+
+    native_model->ctx.class = &dnn_native_class;
     model->model = (void *)native_model;
 
     avio_seek(model_file_context, file_size - 8, SEEK_SET);
@@ -258,20 +286,27 @@ fail:
 DNNReturnType ff_dnn_execute_model_native(const DNNModel *model, DNNData *outputs, uint32_t nb_output)
 {
     NativeModel *native_model = (NativeModel *)model->model;
+    NativeContext *ctx = &native_model->ctx;
     int32_t layer;
     uint32_t nb = FFMIN(nb_output, native_model->nb_output);
 
-    if (native_model->layers_num <= 0 || native_model->operands_num <= 0)
+    if (native_model->layers_num <= 0 || native_model->operands_num <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "No operands or layers in model\n");
         return DNN_ERROR;
-    if (!native_model->operands[0].data)
+    }
+    if (!native_model->operands[0].data) {
+        av_log(ctx, AV_LOG_ERROR, "Empty model input data\n");
         return DNN_ERROR;
+    }
 
     for (layer = 0; layer < native_model->layers_num; ++layer){
         DNNLayerType layer_type = native_model->layers[layer].type;
         if (layer_funcs[layer_type].pf_exec(native_model->operands,
                                             native_model->layers[layer].input_operand_indexes,
                                             native_model->layers[layer].output_operand_index,
-                                            native_model->layers[layer].params) == DNN_ERROR) {
+                                            native_model->layers[layer].params,
+                                            &native_model->ctx) == DNN_ERROR) {
+            av_log(ctx, AV_LOG_ERROR, "Failed to execute model\n");
             return DNN_ERROR;
         }
     }
diff --git a/libavfilter/dnn/dnn_backend_native.h b/libavfilter/dnn/dnn_backend_native.h
index 228d5b742b..3de4a11050 100644
--- a/libavfilter/dnn/dnn_backend_native.h
+++ b/libavfilter/dnn/dnn_backend_native.h
@@ -106,8 +106,13 @@ typedef struct InputParams{
     int height, width, channels;
 } InputParams;
 
+typedef struct NativeContext {
+    const AVClass *class;
+} NativeContext;
+
 // Represents simple feed-forward convolutional network.
 typedef struct NativeModel{
+    NativeContext ctx;
     Layer *layers;
     int32_t layers_num;
     DnnOperand *operands;
diff --git a/libavfilter/dnn/dnn_backend_native_layer_avgpool.c b/libavfilter/dnn/dnn_backend_native_layer_avgpool.c
index bd7bdb4c97..989006d797 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_avgpool.c
+++ b/libavfilter/dnn/dnn_backend_native_layer_avgpool.c
@@ -56,7 +56,7 @@ int dnn_load_layer_avg_pool(Layer *layer, AVIOContext *model_file_context, int f
 }
 
 int dnn_execute_layer_avg_pool(DnnOperand *operands, const int32_t *input_operand_indexes,
-                               int32_t output_operand_index, const void *parameters)
+                               int32_t output_operand_index, const void *parameters, NativeContext *ctx)
 {
     float *output;
     int height_end, width_end, height_radius, width_radius, output_height, output_width, kernel_area;
@@ -107,9 +107,15 @@ int dnn_execute_layer_avg_pool(DnnOperand *operands, const int32_t *input_operan
     output_operand->dims[3] = channel;
     output_operand->data_type = operands[input_operand_index].data_type;
     output_operand->length = calculate_operand_data_length(output_operand);
+    if (output_operand->length <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "The output data length overflows\n");
+        return DNN_ERROR;
+    }
     output_operand->data = av_realloc(output_operand->data, output_operand->length);
-    if (!output_operand->data)
+    if (!output_operand->data) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to reallocate memory for output\n");
         return DNN_ERROR;
+    }
     output = output_operand->data;
 
     for (int y = 0; y < height_end; y += kernel_strides) {
diff --git a/libavfilter/dnn/dnn_backend_native_layer_avgpool.h b/libavfilter/dnn/dnn_backend_native_layer_avgpool.h
index 8e31ddb7c8..543370ff3b 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_avgpool.h
+++ b/libavfilter/dnn/dnn_backend_native_layer_avgpool.h
@@ -35,6 +35,6 @@ typedef struct AvgPoolParams{
 int dnn_load_layer_avg_pool(Layer *layer, AVIOContext *model_file_context, int file_size, int operands_num);
 int dnn_execute_layer_avg_pool(DnnOperand *operands, const int32_t *input_operand_indexes,
-                               int32_t output_operand_index, const void *parameters);
+                               int32_t output_operand_index, const void *parameters, NativeContext *ctx);
 
 #endif
diff --git a/libavfilter/dnn/dnn_backend_native_layer_conv2d.c b/libavfilter/dnn/dnn_backend_native_layer_conv2d.c
index 25356901c2..d079795bf8 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_conv2d.c
+++ b/libavfilter/dnn/dnn_backend_native_layer_conv2d.c
@@ -89,7 +89,7 @@ int dnn_load_layer_conv2d(Layer *layer, AVIOContext *model_file_context, int fil
 }
 
 int dnn_execute_layer_conv2d(DnnOperand *operands, const int32_t *input_operand_indexes,
-                             int32_t output_operand_index, const void *parameters)
+                             int32_t output_operand_index, const void *parameters, NativeContext *ctx)
 {
     float *output;
     int32_t input_operand_index = input_operand_indexes[0];
@@ -113,11 +113,15 @@ int dnn_execute_layer_conv2d(DnnOperand *operands, const int32_t *input_operand_
     output_operand->dims[3] = conv_params->output_num;
     output_operand->data_type = operands[input_operand_index].data_type;
     output_operand->length = calculate_operand_data_length(output_operand);
-    if (output_operand->length <= 0)
+    if (output_operand->length <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "The output data length overflows\n");
         return DNN_ERROR;
+    }
     output_operand->data = av_realloc(output_operand->data, output_operand->length);
-    if (!output_operand->data)
+    if (!output_operand->data) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to reallocate memory for output\n");
         return DNN_ERROR;
+    }
     output = output_operand->data;
 
     av_assert0(channel == conv_params->input_num);
diff --git a/libavfilter/dnn/dnn_backend_native_layer_conv2d.h b/libavfilter/dnn/dnn_backend_native_layer_conv2d.h
index b240b7ef6b..72319f2ebe 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_conv2d.h
+++ b/libavfilter/dnn/dnn_backend_native_layer_conv2d.h
@@ -37,5 +37,5 @@ typedef struct ConvolutionalParams{
 int dnn_load_layer_conv2d(Layer *layer, AVIOContext *model_file_context, int file_size, int operands_num);
 int dnn_execute_layer_conv2d(DnnOperand *operands, const int32_t *input_operand_indexes,
-                             int32_t output_operand_index, const void *parameters);
+                             int32_t output_operand_index, const void *parameters, NativeContext *ctx);
 #endif
diff --git a/libavfilter/dnn/dnn_backend_native_layer_depth2space.c b/libavfilter/dnn/dnn_backend_native_layer_depth2space.c
index 5a61025f7a..4107ee6cae 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_depth2space.c
+++ b/libavfilter/dnn/dnn_backend_native_layer_depth2space.c
@@ -50,7 +50,7 @@ int dnn_load_layer_depth2space(Layer *layer, AVIOContext *model_file_context, in
 }
 
 int dnn_execute_layer_depth2space(DnnOperand *operands, const int32_t *input_operand_indexes,
-                                  int32_t output_operand_index, const void *parameters)
+                                  int32_t output_operand_index, const void *parameters, NativeContext *ctx)
 {
     float *output;
     const DepthToSpaceParams *params = (const DepthToSpaceParams *)parameters;
@@ -75,11 +75,15 @@ int dnn_execute_layer_depth2space(DnnOperand *operands, const int32_t *input_ope
     output_operand->dims[3] = new_channels;
     output_operand->data_type = operands[input_operand_index].data_type;
     output_operand->length = calculate_operand_data_length(output_operand);
-    if (output_operand->length <= 0)
+    if (output_operand->length <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "The output data length overflows\n");
         return DNN_ERROR;
+    }
     output_operand->data = av_realloc(output_operand->data, output_operand->length);
-    if (!output_operand->data)
+    if (!output_operand->data) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to reallocate memory for output\n");
         return DNN_ERROR;
+    }
     output = output_operand->data;
 
     for (y = 0; y < height; ++y){
diff --git a/libavfilter/dnn/dnn_backend_native_layer_depth2space.h b/libavfilter/dnn/dnn_backend_native_layer_depth2space.h
index b2901e0141..648a927f2d 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_depth2space.h
+++ b/libavfilter/dnn/dnn_backend_native_layer_depth2space.h
@@ -36,6 +36,6 @@ typedef struct DepthToSpaceParams{
 int dnn_load_layer_depth2space(Layer *layer, AVIOContext *model_file_context, int file_size, int operands_num);
 int dnn_execute_layer_depth2space(DnnOperand *operands, const int32_t *input_operand_indexes,
-                                  int32_t output_operand_index, const void *parameters);
+                                  int32_t output_operand_index, const void *parameters, NativeContext *ctx);
 
 #endif
diff --git a/libavfilter/dnn/dnn_backend_native_layer_mathbinary.c b/libavfilter/dnn/dnn_backend_native_layer_mathbinary.c
index bffa41cdda..081a70d23c 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_mathbinary.c
+++ b/libavfilter/dnn/dnn_backend_native_layer_mathbinary.c
@@ -77,7 +77,7 @@ int dnn_load_layer_math_binary(Layer *layer, AVIOContext *model_file_context, in
 }
 
 int dnn_execute_layer_math_binary(DnnOperand *operands, const int32_t *input_operand_indexes,
-                                  int32_t output_operand_index, const void *parameters)
+                                  int32_t output_operand_index, const void *parameters, NativeContext *ctx)
 {
     const DnnOperand *input = &operands[input_operand_indexes[0]];
     DnnOperand *output = &operands[output_operand_index];
@@ -91,11 +91,15 @@ int dnn_execute_layer_math_binary(DnnOperand *operands, const int32_t *input_ope
     output->data_type = input->data_type;
     output->length = calculate_operand_data_length(output);
-    if (output->length <= 0)
+    if (output->length <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "The output data length overflows\n");
         return DNN_ERROR;
+    }
     output->data = av_realloc(output->data, output->length);
-    if (!output->data)
+    if (!output->data) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to reallocate memory for output\n");
         return DNN_ERROR;
+    }
 
     dims_count = calculate_operand_dims_count(output);
     src = input->data;
@@ -176,6 +180,7 @@ int dnn_execute_layer_math_binary(DnnOperand *operands, const int32_t *input_ope
         }
         return 0;
     default:
+        av_log(ctx, AV_LOG_ERROR, "Unmatched math binary operator\n");
         return DNN_ERROR;
     }
 }
diff --git a/libavfilter/dnn/dnn_backend_native_layer_mathbinary.h b/libavfilter/dnn/dnn_backend_native_layer_mathbinary.h
index 0acf3b0ea0..719faa8030 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_mathbinary.h
+++ b/libavfilter/dnn/dnn_backend_native_layer_mathbinary.h
@@ -48,6 +48,6 @@ typedef struct DnnLayerMathBinaryParams{
 int dnn_load_layer_math_binary(Layer *layer, AVIOContext *model_file_context, int file_size, int operands_num);
 int dnn_execute_layer_math_binary(DnnOperand *operands, const int32_t *input_operand_indexes,
-                                  int32_t output_operand_index, const void *parameters);
+                                  int32_t output_operand_index, const void *parameters, NativeContext *ctx);
 
 #endif
diff --git a/libavfilter/dnn/dnn_backend_native_layer_mathunary.c b/libavfilter/dnn/dnn_backend_native_layer_mathunary.c
index 57bbd9d3e8..ae5d4daae9 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_mathunary.c
+++ b/libavfilter/dnn/dnn_backend_native_layer_mathunary.c
@@ -53,7 +53,7 @@ int dnn_load_layer_math_unary(Layer *layer, AVIOContext *model_file_context, int
 }
 
 int dnn_execute_layer_math_unary(DnnOperand *operands, const int32_t *input_operand_indexes,
-                                 int32_t output_operand_index, const void *parameters)
+                                 int32_t output_operand_index, const void *parameters, NativeContext *ctx)
 {
     const DnnOperand *input = &operands[input_operand_indexes[0]];
     DnnOperand *output = &operands[output_operand_index];
@@ -67,11 +67,15 @@ int dnn_execute_layer_math_unary(DnnOperand *operands, const int32_t *input_oper
     output->data_type = input->data_type;
     output->length = calculate_operand_data_length(output);
-    if (output->length <= 0)
+    if (output->length <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "The output data length overflows\n");
         return DNN_ERROR;
+    }
     output->data = av_realloc(output->data, output->length);
-    if (!output->data)
+    if (!output->data) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to reallocate memory for output\n");
         return DNN_ERROR;
+    }
 
     dims_count = calculate_operand_dims_count(output);
     src = input->data;
@@ -143,6 +147,7 @@ int dnn_execute_layer_math_unary(DnnOperand *operands, const int32_t *input_oper
             dst[i] = round(src[i]);
         return 0;
     default:
+        av_log(ctx, AV_LOG_ERROR, "Unmatched math unary operator\n");
         return DNN_ERROR;
     }
 }
diff --git a/libavfilter/dnn/dnn_backend_native_layer_mathunary.h b/libavfilter/dnn/dnn_backend_native_layer_mathunary.h
index d6a61effd5..301d02e5fb 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_mathunary.h
+++ b/libavfilter/dnn/dnn_backend_native_layer_mathunary.h
@@ -55,6 +55,6 @@ typedef struct DnnLayerMathUnaryParams{
 int dnn_load_layer_math_unary(Layer *layer, AVIOContext *model_file_context, int file_size, int operands_num);
 int dnn_execute_layer_math_unary(DnnOperand *operands, const int32_t *input_operand_indexes,
-                                 int32_t output_operand_index, const void *parameters);
+                                 int32_t output_operand_index, const void *parameters, NativeContext *ctx);
 
 #endif
diff --git a/libavfilter/dnn/dnn_backend_native_layer_maximum.c b/libavfilter/dnn/dnn_backend_native_layer_maximum.c
index cdddfdd87b..7ad5a22969 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_maximum.c
+++ b/libavfilter/dnn/dnn_backend_native_layer_maximum.c
@@ -50,7 +50,7 @@ int dnn_load_layer_maximum(Layer *layer, AVIOContext *model_file_context, int fi
 }
 
 int dnn_execute_layer_maximum(DnnOperand *operands, const int32_t *input_operand_indexes,
-                              int32_t output_operand_index, const void *parameters)
+                              int32_t output_operand_index, const void *parameters, NativeContext *ctx)
 {
     const DnnOperand *input = &operands[input_operand_indexes[0]];
     DnnOperand *output = &operands[output_operand_index];
@@ -64,11 +64,15 @@ int dnn_execute_layer_maximum(DnnOperand *operands, const int32_t *input_operand
     output->data_type = input->data_type;
     output->length = calculate_operand_data_length(output);
-    if (output->length <= 0)
+    if (output->length <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "The output data length overflows\n");
         return DNN_ERROR;
+    }
     output->data = av_realloc(output->data, output->length);
-    if (!output->data)
+    if (!output->data) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to reallocate memory for output\n");
         return DNN_ERROR;
+    }
 
     dims_count = calculate_operand_dims_count(output);
     src = input->data;
diff --git a/libavfilter/dnn/dnn_backend_native_layer_maximum.h b/libavfilter/dnn/dnn_backend_native_layer_maximum.h
index c049c63fd8..be63a3ab5b 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_maximum.h
+++ b/libavfilter/dnn/dnn_backend_native_layer_maximum.h
@@ -39,6 +39,6 @@ typedef struct DnnLayerMaximumParams{
 int dnn_load_layer_maximum(Layer *layer, AVIOContext *model_file_context, int file_size, int operands_num);
 int dnn_execute_layer_maximum(DnnOperand *operands, const int32_t *input_operand_indexes,
-                              int32_t output_operand_index, const void *parameters);
+                              int32_t output_operand_index, const void *parameters, NativeContext *ctx);
 
 #endif
diff --git a/libavfilter/dnn/dnn_backend_native_layer_pad.c b/libavfilter/dnn/dnn_backend_native_layer_pad.c
index 5452d22878..05892d43f4 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_pad.c
+++ b/libavfilter/dnn/dnn_backend_native_layer_pad.c
@@ -76,7 +76,7 @@ static int after_get_buddy(int given, int border, LayerPadModeParam mode)
 }
 
 int dnn_execute_layer_pad(DnnOperand *operands, const int32_t *input_operand_indexes,
-                          int32_t output_operand_index, const void *parameters)
+                          int32_t output_operand_index, const void *parameters, NativeContext *ctx)
 {
     int32_t before_paddings;
     int32_t after_paddings;
@@ -111,11 +111,15 @@ int dnn_execute_layer_pad(DnnOperand *operands, const int32_t *input_operand_ind
     output_operand->dims[3] = new_channel;
     output_operand->data_type = operands[input_operand_index].data_type;
     output_operand->length = calculate_operand_data_length(output_operand);
-    if (output_operand->length <= 0)
+    if (output_operand->length <= 0) {
+        av_log(ctx, AV_LOG_ERROR, "The output data length overflows\n");
         return DNN_ERROR;
+    }
     output_operand->data = av_realloc(output_operand->data, output_operand->length);
-    if (!output_operand->data)
+    if (!output_operand->data) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to reallocate memory for output\n");
         return DNN_ERROR;
+    }
     output = output_operand->data;
 
     // copy the original data
diff --git a/libavfilter/dnn/dnn_backend_native_layer_pad.h b/libavfilter/dnn/dnn_backend_native_layer_pad.h
index 18e05bdd5c..6c69211824 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_pad.h
+++ b/libavfilter/dnn/dnn_backend_native_layer_pad.h
@@ -38,6 +38,6 @@ typedef struct LayerPadParams{
 int dnn_load_layer_pad(Layer *layer, AVIOContext *model_file_context, int file_size, int operands_num);
 int dnn_execute_layer_pad(DnnOperand *operands, const int32_t *input_operand_indexes,
-                          int32_t output_operand_index, const void *parameters);
+                          int32_t output_operand_index, const void *parameters, NativeContext *ctx);
 
 #endif
diff --git a/libavfilter/dnn/dnn_backend_native_layers.h b/libavfilter/dnn/dnn_backend_native_layers.h
index b696e9c6fa..dc76ace65a 100644
--- a/libavfilter/dnn/dnn_backend_native_layers.h
+++ b/libavfilter/dnn/dnn_backend_native_layers.h
@@ -25,7 +25,7 @@
 #include "dnn_backend_native.h"
 
 typedef int (*LAYER_EXEC_FUNC)(DnnOperand *operands, const int32_t *input_operand_indexes,
-                               int32_t output_operand_index, const void *parameters);
+                               int32_t output_operand_index, const void *parameters, NativeContext *ctx);
 typedef int (*LAYER_LOAD_FUNC)(Layer *layer, AVIOContext *model_file_context, int file_size, int operands_num);
 
 typedef struct LayerFunc {
diff --git a/tests/dnn/dnn-layer-avgpool-test.c b/tests/dnn/dnn-layer-avgpool-test.c
index d7c33a0e88..0e6be8ba57 100644
--- a/tests/dnn/dnn-layer-avgpool-test.c
+++ b/tests/dnn/dnn-layer-avgpool-test.c
@@ -91,7 +91,7 @@ static int test_with_same(void)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_avg_pool(operands, input_indexes, 1, &params);
+    dnn_execute_layer_avg_pool(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(expected_output) / sizeof(float); ++i) {
@@ -171,7 +171,7 @@ static int test_with_valid(void)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_avg_pool(operands, input_indexes, 1, &params);
+    dnn_execute_layer_avg_pool(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(expected_output) / sizeof(float); ++i) {
diff --git a/tests/dnn/dnn-layer-conv2d-test.c b/tests/dnn/dnn-layer-conv2d-test.c
index 2da01e5372..836839cc64 100644
--- a/tests/dnn/dnn-layer-conv2d-test.c
+++ b/tests/dnn/dnn-layer-conv2d-test.c
@@ -114,7 +114,7 @@ static int test_with_same_dilate(void)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_conv2d(operands, input_indexes, 1, &params);
+    dnn_execute_layer_conv2d(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(expected_output) / sizeof(float); i++) {
@@ -214,7 +214,7 @@ static int test_with_valid(void)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_conv2d(operands, input_indexes, 1, &params);
+    dnn_execute_layer_conv2d(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(expected_output) / sizeof(float); i++) {
diff --git a/tests/dnn/dnn-layer-depth2space-test.c b/tests/dnn/dnn-layer-depth2space-test.c
index 5225ec7b7a..2c641884c1 100644
--- a/tests/dnn/dnn-layer-depth2space-test.c
+++ b/tests/dnn/dnn-layer-depth2space-test.c
@@ -81,7 +81,7 @@ static int test(void)
 
     input_indexes[0] = 0;
     params.block_size = 2;
-    dnn_execute_layer_depth2space(operands, input_indexes, 1, &params);
+    dnn_execute_layer_depth2space(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(expected_output) / sizeof(float); i++) {
diff --git a/tests/dnn/dnn-layer-mathbinary-test.c b/tests/dnn/dnn-layer-mathbinary-test.c
index e7f8f8557c..187a2b9bf2 100644
--- a/tests/dnn/dnn-layer-mathbinary-test.c
+++ b/tests/dnn/dnn-layer-mathbinary-test.c
@@ -69,7 +69,7 @@ static int test_broadcast_input0(DNNMathBinaryOperation op)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_math_binary(operands, input_indexes, 1, &params);
+    dnn_execute_layer_math_binary(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(input) / sizeof(float); i++) {
@@ -109,7 +109,7 @@ static int test_broadcast_input1(DNNMathBinaryOperation op)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_math_binary(operands, input_indexes, 1, &params);
+    dnn_execute_layer_math_binary(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(input) / sizeof(float); i++) {
@@ -157,7 +157,7 @@ static int test_no_broadcast(DNNMathBinaryOperation op)
 
     input_indexes[0] = 0;
     input_indexes[1] = 1;
-    dnn_execute_layer_math_binary(operands, input_indexes, 2, &params);
+    dnn_execute_layer_math_binary(operands, input_indexes, 2, &params, NULL);
 
     output = operands[2].data;
     for (int i = 0; i < sizeof(input0) / sizeof(float); i++) {
diff --git a/tests/dnn/dnn-layer-mathunary-test.c b/tests/dnn/dnn-layer-mathunary-test.c
index e9235120f3..ce14c41311 100644
--- a/tests/dnn/dnn-layer-mathunary-test.c
+++ b/tests/dnn/dnn-layer-mathunary-test.c
@@ -87,7 +87,7 @@ static int test(DNNMathUnaryOperation op)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_math_unary(operands, input_indexes, 1, &params);
+    dnn_execute_layer_math_unary(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(input) / sizeof(float); ++i) {
diff --git a/tests/dnn/dnn-layer-maximum-test.c b/tests/dnn/dnn-layer-maximum-test.c
index 06daf64481..c982670591 100644
--- a/tests/dnn/dnn-layer-maximum-test.c
+++ b/tests/dnn/dnn-layer-maximum-test.c
@@ -45,7 +45,7 @@ static int test(void)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_maximum(operands, input_indexes, 1, &params);
+    dnn_execute_layer_maximum(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(input) / sizeof(float); i++) {
diff --git a/tests/dnn/dnn-layer-pad-test.c b/tests/dnn/dnn-layer-pad-test.c
index ea8c824d1e..6a72adb3ae 100644
--- a/tests/dnn/dnn-layer-pad-test.c
+++ b/tests/dnn/dnn-layer-pad-test.c
@@ -79,7 +79,7 @@ static int test_with_mode_symmetric(void)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_pad(operands, input_indexes, 1, &params);
+    dnn_execute_layer_pad(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(expected_output) / sizeof(float); i++) {
@@ -144,7 +144,7 @@ static int test_with_mode_reflect(void)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_pad(operands, input_indexes, 1, &params);
+    dnn_execute_layer_pad(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
     for (int i = 0; i < sizeof(expected_output) / sizeof(float); i++) {
@@ -210,7 +210,7 @@ static int test_with_mode_constant(void)
     operands[1].data = NULL;
 
     input_indexes[0] = 0;
-    dnn_execute_layer_pad(operands, input_indexes, 1, &params);
+    dnn_execute_layer_pad(operands, input_indexes, 1, &params, NULL);
 
     output = operands[1].data;
    for (int i = 0; i < sizeof(expected_output) / sizeof(float); i++) {