Fast Depth Estimation in a Single Image Using Lightweight Efficient Neural Network
Depth estimation is a crucial and fundamental problem in computer vision. Conventional methods reconstruct scenes using feature points extracted from multiple images; however, these approaches require multiple images and are therefore difficult to apply in real-time settings. Moreover, the special equipment required by hardware-based approaches using 3D sensors is expensive. ...
Main Authors: | Sangwon Kim; Jaeyeal Nam; Byoungchul Ko |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2019-10-01 |
Series: | Sensors |
Subjects: | depth estimation; convolutional neural network; lightweight efficient neural network model; single image; ordinal regression |
Online Access: | https://www.mdpi.com/1424-8220/19/20/4434 |
id | doaj-ff8c15dcaf2c4b3ca1b430c642906a16
---|---|
record_format | Article
spelling | Sensors, vol. 19, no. 20, article 4434 (2019-10-01); MDPI AG; ISSN 1424-8220; doi:10.3390/s19204434; Sangwon Kim, Jaeyeal Nam, Byoungchul Ko (Department of Computer Engineering, Keimyung University, Daegu 42601, Korea); indexed 2020-11-25T01:14:08Z
collection | DOAJ
language | English
format | Article
sources | DOAJ
author | Sangwon Kim; Jaeyeal Nam; Byoungchul Ko
title | Fast Depth Estimation in a Single Image Using Lightweight Efficient Neural Network
publisher | MDPI AG
series | Sensors
issn | 1424-8220
publishDate | 2019-10-01
description | Depth estimation is a crucial and fundamental problem in computer vision. Conventional methods reconstruct scenes using feature points extracted from multiple images; however, these approaches require multiple images and are therefore difficult to apply in real-time settings. Moreover, the special equipment required by hardware-based approaches using 3D sensors is expensive. Software-based methods that estimate depth from a single image using machine learning or deep learning are therefore emerging as alternatives. In this paper, we propose an algorithm that generates a depth map in real time from a single image using an optimized lightweight efficient neural network (L-ENet) instead of physical equipment such as an infrared sensor or a multi-view camera. Because depth values are continuous and can produce locally ambiguous results, pixel-wise prediction with ordinal depth-range classification is applied. Various convolution techniques are used to extract a dense feature map, and the number of parameters is greatly reduced by trimming the network layers. With the proposed L-ENet, an accurate depth map can be generated quickly from a single image, and the predicted depth values stay close to the ground truth with small errors. Experiments confirmed that L-ENet achieves significantly improved performance over state-of-the-art single-image depth estimation algorithms. (An illustrative sketch of the ordinal classification idea follows this record.)
topic | depth estimation; convolutional neural network; lightweight efficient neural network model; single image; ordinal regression
url | https://www.mdpi.com/1424-8220/19/20/4434
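
The description above mentions pixel-wise prediction with ordinal depth-range classification. As a rough illustration only, the following is a minimal PyTorch sketch of that general idea; it is not the authors' L-ENet. The tiny backbone, the bin count `K`, the depth range `D_MIN`/`D_MAX`, and the uniform discretization are assumptions chosen for this example; the paper at the URL above defines the actual architecture and discretization.

```python
# Minimal, illustrative sketch of pixel-wise ordinal depth-range classification.
# NOT the authors' L-ENet: the backbone, K, and the depth range are assumptions.
import torch
import torch.nn as nn

K = 80                       # assumed number of ordinal depth thresholds
D_MIN, D_MAX = 0.7, 10.0     # assumed depth range in metres

class TinyOrdinalDepthNet(nn.Module):
    """Toy stand-in for a lightweight encoder plus an ordinal classification head."""
    def __init__(self, k=K):
        super().__init__()
        self.features = nn.Sequential(                 # tiny dense-feature extractor
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=2, dilation=2), nn.ReLU(inplace=True),
        )
        self.head = nn.Conv2d(32, k, 1)                # K "depth > threshold_k?" logits per pixel

    def forward(self, x):
        return self.head(self.features(x))             # (B, K, H, W) ordinal logits

def decode_depth(logits):
    """Turn per-pixel ordinal logits into a metric depth map."""
    probs = torch.sigmoid(logits)                                  # P(depth > threshold_k)
    rank = (probs > 0.5).sum(dim=1, keepdim=True).float()          # thresholds exceeded
    # Uniform discretization of [D_MIN, D_MAX]; the paper may use a different scheme.
    return D_MIN + (D_MAX - D_MIN) * rank / K

if __name__ == "__main__":
    model = TinyOrdinalDepthNet()
    image = torch.rand(1, 3, 240, 320)                 # a single RGB image
    depth = decode_depth(model(image))                 # (1, 1, 240, 320) depth map
    print(depth.shape, float(depth.min()), float(depth.max()))
```

Decoding by counting exceeded thresholds, rather than taking an argmax over unrelated classes, keeps the prediction consistent with the ordering of the depth bins, which is the usual motivation for treating depth estimation as ordinal classification instead of plain classification.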