From patchwork Thu Aug 1 06:18:44 2019
X-Patchwork-Submitter: "Guo, Yejun"
X-Patchwork-Id: 14177
From: "Guo, Yejun"
To: ffmpeg-devel@ffmpeg.org
Date: Thu, 1 Aug 2019 14:18:44 +0800
Message-Id: <1564640324-759-1-git-send-email-yejun.guo@intel.com>
Subject: [FFmpeg-devel] [PATCH] dnn: rename function from dnn_execute_layer_pad to avfilter_dnn_execute_layer_pad

background: DNN (deep
neural network) is a submodule of libavfilter, and FATE/dnn is the unit test suite for the DNN module, with one unit test per DNN layer. The unit tests are not based on the APIs exported by libavfilter; they call directly into the functions within the DNN submodule. (It is not easy for the tests to verify DNN layers through the libavfilter APIs.)

There is an issue when running the following commands:

build$ ../ffmpeg/configure --disable-static --enable-shared
build$ make
build$ make fate-dnn-layer-pad

Part of the error message:

tests/dnn/dnn-layer-pad-test.o: In function `test_with_mode_symmetric':
/work/media/ffmpeg/build/src/tests/dnn/dnn-layer-pad-test.c:73: undefined reference to `dnn_execute_layer_pad'

The root cause is that the function dnn_execute_layer_pad is a LOCAL symbol in libavfilter.so, so the linker cannot find it when building dnn-layer-pad-test. To check this, just run:

readelf -s libavfilter/libavfilter.so | grep dnn

There are three possible methods to fix this issue:
1) delete the dnn unit tests from FATE. I don't think it is a good choice.
2) ask people to build the static libraries if they want to use FATE.
3) export the dnn layer functions. This is what this patch does.

Besides renaming the function so that it is exported from the .so library, we also need to add the option -Wl,-rpath when building the dnn unit tests, so the path is written into the test binary and helps to find the .so libraries during execution.
Signed-off-by: Guo, Yejun
---
 libavfilter/dnn/dnn_backend_native.c           | 2 +-
 libavfilter/dnn/dnn_backend_native_layer_pad.c | 2 +-
 libavfilter/dnn/dnn_backend_native_layer_pad.h | 2 +-
 tests/dnn/Makefile                             | 4 +++-
 tests/dnn/dnn-layer-pad-test.c                 | 6 +++---
 5 files changed, 9 insertions(+), 7 deletions(-)

diff --git a/libavfilter/dnn/dnn_backend_native.c b/libavfilter/dnn/dnn_backend_native.c
index 09c583b..fa412e0 100644
--- a/libavfilter/dnn/dnn_backend_native.c
+++ b/libavfilter/dnn/dnn_backend_native.c
@@ -377,7 +377,7 @@ DNNReturnType ff_dnn_execute_model_native(const DNNModel *model, DNNData *output
             break;
         case MIRROR_PAD:
             pad_params = (LayerPadParams *)network->layers[layer].params;
-            dnn_execute_layer_pad(network->layers[layer - 1].output, network->layers[layer].output,
+            avfilter_dnn_execute_layer_pad(network->layers[layer - 1].output, network->layers[layer].output,
                                   pad_params, 1, cur_height, cur_width, cur_channels);
             cur_height = cur_height + pad_params->paddings[1][0] + pad_params->paddings[1][1];
             cur_width = cur_width + pad_params->paddings[2][0] + pad_params->paddings[2][1];
diff --git a/libavfilter/dnn/dnn_backend_native_layer_pad.c b/libavfilter/dnn/dnn_backend_native_layer_pad.c
index 5417d73..526ddfe 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_pad.c
+++ b/libavfilter/dnn/dnn_backend_native_layer_pad.c
@@ -48,7 +48,7 @@ static int after_get_buddy(int given, int border, LayerPadModeParam mode)
     }
 }
 
-void dnn_execute_layer_pad(const float *input, float *output, const LayerPadParams *params, int number, int height, int width, int channel)
+void avfilter_dnn_execute_layer_pad(const float *input, float *output, const LayerPadParams *params, int number, int height, int width, int channel)
 {
     int32_t before_paddings;
     int32_t after_paddings;
diff --git a/libavfilter/dnn/dnn_backend_native_layer_pad.h b/libavfilter/dnn/dnn_backend_native_layer_pad.h
index 0fbe652..3f93d6d 100644
--- a/libavfilter/dnn/dnn_backend_native_layer_pad.h
+++ b/libavfilter/dnn/dnn_backend_native_layer_pad.h
@@ -35,6 +35,6 @@ typedef struct LayerPadParams{
     float constant_values;
 } LayerPadParams;
 
-void dnn_execute_layer_pad(const float *input, float *output, const LayerPadParams *params, int number, int height, int width, int channel);
+void avfilter_dnn_execute_layer_pad(const float *input, float *output, const LayerPadParams *params, int number, int height, int width, int channel);
 
 #endif
diff --git a/tests/dnn/Makefile b/tests/dnn/Makefile
index b2e6680..a34f2e7 100644
--- a/tests/dnn/Makefile
+++ b/tests/dnn/Makefile
@@ -4,8 +4,10 @@ DNNTESTOBJS := $(DNNTESTOBJS:%=$(DNNTESTSDIR)%) $(DNNTESTPROGS:%=$(DNNTESTSDIR)
 DNNTESTPROGS := $(DNNTESTPROGS:%=$(DNNTESTSDIR)/%-test$(EXESUF))
 
 -include $(wildcard $(DNNTESTOBJS:.o=.d))
 
+LDDNNFLAGS = -Wl,-rpath=:libpostproc:libswresample:libswscale:libavfilter:libavdevice:libavformat:libavcodec:libavutil:libavresample
+
 $(DNNTESTPROGS): %$(EXESUF): %.o $(FF_DEP_LIBS)
-	$(LD) $(LDFLAGS) $(LDEXEFLAGS) $(LD_O) $(filter %.o,$^) $(FF_EXTRALIBS) $(ELIBS)
+	$(LD) $(LDFLAGS) $(LDDNNFLAGS) $(LDEXEFLAGS) $(LD_O) $(filter %.o,$^) $(FF_EXTRALIBS) $(ELIBS)
 
 testclean::
 	$(RM) $(addprefix $(DNNTESTSDIR)/,$(CLEANSUFFIXES) *-test$(EXESUF))
diff --git a/tests/dnn/dnn-layer-pad-test.c b/tests/dnn/dnn-layer-pad-test.c
index 28a49eb..078a2f4 100644
--- a/tests/dnn/dnn-layer-pad-test.c
+++ b/tests/dnn/dnn-layer-pad-test.c
@@ -70,7 +70,7 @@ static int test_with_mode_symmetric(void)
     params.paddings[3][0] = 0;
     params.paddings[3][1] = 0;
 
-    dnn_execute_layer_pad(input, output, &params, 1, 4, 4, 3);
+    avfilter_dnn_execute_layer_pad(input, output, &params, 1, 4, 4, 3);
 
     for (int i = 0; i < sizeof(output) / sizeof(float); i++) {
         if (fabs(output[i] - expected_output[i]) > EPSON) {
@@ -123,7 +123,7 @@ static int test_with_mode_reflect(void)
     params.paddings[3][0] = 0;
    params.paddings[3][1] = 0;
 
-    dnn_execute_layer_pad(input, output, &params, 3, 2, 2, 3);
+    avfilter_dnn_execute_layer_pad(input, output, &params, 3, 2, 2, 3);
 
     for (int i = 0; i < sizeof(output) / sizeof(float); i++) {
         if (fabs(output[i] - expected_output[i]) > EPSON) {
@@ -177,7 +177,7 @@ static int test_with_mode_constant(void)
     params.paddings[3][0] = 1;
     params.paddings[3][1] = 2;
 
-    dnn_execute_layer_pad(input, output, &params, 1, 2, 2, 3);
+    avfilter_dnn_execute_layer_pad(input, output, &params, 1, 2, 2, 3);
 
     for (int i = 0; i < sizeof(output) / sizeof(float); i++) {
         if (fabs(output[i] - expected_output[i]) > EPSON) {