
Optimal Camera Placement to Generate 3D Reconstruction of a Mixed-Reality Human in Real Environments

Abstract
Virtual reality and augmented reality increasingly rely on information captured from real environments to provide immersive experiences. In particular, the three-dimensional model data that underlies virtual places can be built manually with commercial modeling toolkits, but advances in sensing and computer vision now allow virtual environments to be created automatically. Specifically, a 3D reconstruction approach can generate a single 3D model from images of a real scene captured by several cameras (multi-cameras), with the goal of achieving high reconstruction precision. However, guidelines for choosing the optimal number of cameras and their configuration when capturing targets in real environments (e.g., actual people) with cameras in unconventional positions are lacking. In this study, we propose an optimal camera placement strategy for acquiring high-quality 3D data of a human in a real three-dimensional space using multiple irregularly placed cameras, which is essential for organizing the image information gathered during capture. Our results show that installation costs can be lowered by arranging a minimal number of cameras in an arbitrary space, and that virtual humans can be generated automatically and with high accuracy using the optimal irregular camera placement.
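
For readers unfamiliar with coverage-driven camera placement, the following Python sketch illustrates the general idea in the abstract: greedily choosing camera poses from a candidate set so that sample points on a human-sized volume are each seen by a minimum number of cameras. This is only a minimal illustration, not the authors' algorithm; all geometry, thresholds, and function names are assumptions made for the example.

```python
import numpy as np

def make_target_points(n=200, height=1.8, radius=0.3, seed=0):
    """Sample points on a cylinder approximating a standing person."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n)
    z = rng.uniform(0, height, n)
    return np.stack([radius * np.cos(theta), radius * np.sin(theta), z], axis=1)

def make_candidate_cameras(n=60, ring_radius=2.5, heights=(0.8, 1.6, 2.4)):
    """Candidate camera positions on rings around the subject, each aimed at the torso."""
    cams, target = [], np.array([0.0, 0.0, 0.9])  # approximate torso center (assumed)
    for h in heights:
        for theta in np.linspace(0, 2 * np.pi, n // len(heights), endpoint=False):
            pos = np.array([ring_radius * np.cos(theta), ring_radius * np.sin(theta), h])
            view = target - pos
            cams.append((pos, view / np.linalg.norm(view)))
    return cams

def visible(cam_pos, cam_dir, points, fov_deg=60.0, max_range=4.0):
    """Boolean mask: points inside the camera's field of view and range (no occlusion test)."""
    vec = points - cam_pos
    dist = np.linalg.norm(vec, axis=1)
    cos_angle = (vec @ cam_dir) / np.maximum(dist, 1e-9)
    return (dist <= max_range) & (cos_angle >= np.cos(np.radians(fov_deg / 2)))

def greedy_placement(cams, points, min_views=3):
    """Add the camera that most increases coverage until every point has min_views views."""
    views = np.zeros(len(points), dtype=int)
    chosen, remaining = [], list(range(len(cams)))
    while remaining and (views < min_views).any():
        gains = [np.minimum(views + visible(cams[i][0], cams[i][1], points), min_views).sum()
                 for i in remaining]
        best = remaining[int(np.argmax(gains))]
        views += visible(cams[best][0], cams[best][1], points)
        chosen.append(best)
        remaining.remove(best)
    return chosen, (views >= min_views).mean()

if __name__ == "__main__":
    pts = make_target_points()
    cams = make_candidate_cameras()
    selected, coverage = greedy_placement(cams, pts, min_views=3)
    print(f"{len(selected)} cameras cover {coverage:.0%} of sample points with >=3 views")
```

In this toy setup, the greedy criterion (capped total view count) keeps the number of selected cameras small while guaranteeing multi-view coverage, which mirrors the cost-versus-accuracy trade-off discussed in the abstract.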
Issued Date
2023
Author(s)
Juhwan Kim
Dongsik Jo
Type
Article
Keyword
3D reconstruction; optimal camera placement; multi-cameras; virtual human; mixed reality
DOI
10.3390/electronics12204244
URI
https://oak.ulsan.ac.kr/handle/2021.oak/16953
Publisher
ELECTRONICS
Language
English
ISSN
2079-9292
Citation Volume
12
Citation Number
20
Citation Start Page
2
Citation End Page
12
Appears in Collections:
Engineering > Engineering
Access and License
  • Access type: Open
File List
  • No related files exist.

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.