
LNFCOS: Efficient Object Detection through Deep Learning Based on LNblock

Abstract
In recent deep-learning-based real-time object detection methods, the trade-off between accuracy and computational cost is an important consideration. Therefore, based on the fully convolutional one-stage detector (FCOS), a one-stage object detection method, we propose light next FCOS (LNFCOS), which achieves an optimal trade-off between computational cost and accuracy. In LNFCOS, the loss of low- and high-level information is minimized by combining features of different scales through the proposed feature fusion module. Moreover, the light next block (LNblock) is proposed for efficient feature extraction. LNblock performs feature extraction at a lower computational cost than standard convolutions through sequential operations on small amounts of spatial and channel information. To determine the optimal parameters of LNFCOS and to ensure a fair comparison, experiments and evaluations were conducted on the publicly available benchmark datasets MS COCO and PASCAL VOC, with average precision (AP) used as the metric for quantitative evaluation. LNFCOS achieved an optimal trade-off between computational cost and accuracy, attaining detection accuracies of 37.2 AP and 79.3 AP on the MS COCO and PASCAL VOC datasets, respectively, with 36% lower computational cost than FCOS.
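The abstract's claim that factorized blocks cut computational cost relative to standard convolutions can be illustrated with a simple multiply-accumulate (MAC) count. The sketch below compares a standard k×k convolution with a depthwise-separable factorization, the kind of decomposition that lightweight blocks such as LNblock build on; the exact LNblock structure is defined in the paper itself, and the layer sizes here are illustrative assumptions, not the authors' configuration.

```python
def conv_macs(h, w, c_in, c_out, k):
    """MACs for a standard k x k convolution on an h x w feature map
    (stride 1, 'same' padding): every output position multiplies a
    k*k*c_in patch for each of the c_out output channels."""
    return h * w * c_in * c_out * k * k


def depthwise_separable_macs(h, w, c_in, c_out, k):
    """MACs for a depthwise k x k convolution (one filter per input
    channel) followed by a 1 x 1 pointwise convolution that mixes
    channels -- a common lightweight factorization."""
    depthwise = h * w * c_in * k * k
    pointwise = h * w * c_in * c_out
    return depthwise + pointwise


if __name__ == "__main__":
    # Illustrative layer shape (assumed, not from the paper).
    h = w = 56
    c_in = c_out = 128
    k = 3

    std = conv_macs(h, w, c_in, c_out, k)
    sep = depthwise_separable_macs(h, w, c_in, c_out, k)
    print(f"standard conv:       {std:,} MACs")
    print(f"depthwise-separable: {sep:,} MACs ({sep / std:.1%} of standard)")
```

For this shape the factorized form needs roughly 12% of the standard convolution's MACs, which is the kind of saving that lets a detector trade a small amount of accuracy for a large reduction in cost.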
Author(s)
Beomyeon Hwang; Sanghun Lee; Hyunho Han
Issued Date
2022
Type
Article
Keyword
convolutional neural networks; object detection; FCOS; attention method; LNblock; lightweight
DOI
10.3390/electronics11172783
URI
https://oak.ulsan.ac.kr/handle/2021.oak/15324
Publisher
ELECTRONICS
Language
English
ISSN
2079-9292
Citation Volume
11
Citation Number
17
Citation Start Page
1
Citation End Page
15
Appears in Collections:
Engineering > IT Convergence
Access and License
  • Access: Open
File List
  • No related files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.