
An Efficient Multi-view Facial Expression Classifier Implementing on Edge Device

Abstract
Robotic technology demands that human-robot interaction implement a real-time facial emotion detector, which recognizes the user's expressions. The application must therefore run quickly to support the robot's capabilities, helping it analyze a customer's face effectively. However, previous methods recognize non-frontal faces poorly, because facial pose variations expose only partial facial features. This paper proposes a multi-view real-time facial emotion detector based on a lightweight convolutional neural network. It offers a four-stage backbone as an efficient feature extractor that discriminates specific facial components. Convolution with the Cross Stage Partial (CSP) approach is employed to reduce the computation of convolution operations, and an attention module is inserted into the CSP block. These modules also allow the detector to run quickly on edge devices. The classification system learns facial-feature information from the KDEF dataset. As a result, the facial emotion recognizer achieves performance comparable to other methods, with an accuracy of 97.10% on KDEF, 73.95% on FER-2013, and 84.91% on the RAF-DB dataset. Integrated with a face detector, the system achieves a processing speed of 30 frames per second on the Jetson Nano.
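The abstract's efficiency claim rests on the CSP idea: only part of the feature channels pass through the (expensive) convolutional transform, while the rest are forwarded unchanged and concatenated back, and an attention module re-weights the transformed channels. The paper's exact block design is not given here, so the following is a minimal NumPy sketch under assumed details (an even channel split and an SE-style channel attention with reduction ratio 2); the function names and shapes are illustrative, not the authors' implementation.

```python
import numpy as np

def channel_attention(x, w1, w2):
    # SE-style channel attention (assumed design): global average pool
    # over spatial dims -> two FC layers -> sigmoid gate per channel.
    s = x.mean(axis=(0, 1))                  # (C,) squeezed descriptor
    z = np.maximum(w1 @ s, 0.0)              # (C//r,) reduction + ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ z)))   # (C,) sigmoid gate
    return x * gate[None, None, :]           # rescale each channel map

def csp_block(x, transform, w1, w2):
    # Cross Stage Partial: only half of the channels go through the
    # transform (standing in for the conv stack); the other half is
    # passed through untouched and concatenated, cutting the cost of
    # the convolution path roughly in half.
    c = x.shape[-1] // 2
    part1, part2 = x[..., :c], x[..., c:]
    part2 = channel_attention(transform(part2), w1, w2)
    return np.concatenate([part1, part2], axis=-1)
```

A quick shape check: for an (H, W, 16) input, the transform and attention see only 8 channels, and the output keeps the full 16-channel width after concatenation.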
Author(s)
Muhamad Dwisnanto Putro; Duy-Linh Nguyen; Adri Priadana; Kang-Hyun Jo
Issued Date
2022
Type
Article
DOI
10.1007/978-981-19-8234-7_40
URI
https://oak.ulsan.ac.kr/handle/2021.oak/14912
Publisher
Communications in Computer and Information Science
Language
English
ISSN
1865-0929
Citation Volume
1716
Citation Number
1
Citation Start Page
517
Citation End Page
529
Appears in Collections:
Medicine > Nursing
Access and License
  • Access type: Open
File List
  • No related files are available.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.