Application of Hyperspectral Imaging and Deep Learning for Robust Prediction of Sugar and pH Levels in Wine Grape Berries

Bibliographic Details
Main Authors: Véronique Gomes, Ana Mendes-Ferreira, Pedro Melo-Pinto
Format: Article
Language: English
Published: MDPI AG 2021-05-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/21/10/3459
Description
Summary: Remote sensing technology, such as hyperspectral imaging, in combination with machine learning algorithms, has emerged as a viable tool for rapid and nondestructive assessment of wine grape ripeness. However, differences in terroir, together with climatic variations and the variability exhibited by different grape varieties, have a considerable impact on the grape ripening stages within a vintage and between vintages and, consequently, on the robustness of predictive models. To address this challenge, we present a novel model based on a one-dimensional convolutional neural network architecture for the prediction of sugar content and pH, using reflectance hyperspectral data from different vintages. We aimed to evaluate the model’s generalization capacity for different varieties and for a vintage not employed in the training process, using independent test sets. A transfer learning mechanism, based on the proposed convolutional neural network, was also used to evaluate improvements in the model’s generalization. Overall, the results showed very good generalization ability, with RMSEP values of 1.118 °Brix and 1.085 °Brix for sugar content and 0.199 and 0.183 for pH on the test sets using different varieties and a different vintage, respectively, improving on the current state of the art.
ISSN: 1424-8220
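
To make the approach described in the summary more concrete, the sketch below shows how a one-dimensional CNN regressor for spectra could be set up and then adapted to a new domain with a simple transfer-learning step. This is not the authors' code or architecture: the number of spectral bands (256), the layer sizes, the synthetic data, the fine-tuning scheme, and all hyperparameters are illustrative assumptions, written here in TensorFlow/Keras.

```python
"""Minimal sketch (not the published model) of a 1-D CNN that maps a
reflectance spectrum to two targets (sugar content in °Brix and pH),
followed by a simple transfer-learning step for a new variety/vintage."""
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

N_BANDS = 256  # assumed number of spectral bands per berry spectrum


def build_model(n_bands: int = N_BANDS) -> tf.keras.Model:
    """1-D CNN regressor: convolutional feature extractor + dense head."""
    inputs = layers.Input(shape=(n_bands, 1))
    x = layers.Conv1D(16, kernel_size=7, activation="relu")(inputs)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.Conv1D(32, kernel_size=5, activation="relu")(x)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.Flatten()(x)
    x = layers.Dense(64, activation="relu")(x)
    outputs = layers.Dense(2)(x)  # [sugar content (°Brix), pH]
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model


def rmsep(y_true: np.ndarray, y_pred: np.ndarray) -> np.ndarray:
    """Root mean squared error of prediction, per output column."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2, axis=0))


if __name__ == "__main__":
    # Synthetic stand-ins for hyperspectral reflectance spectra and targets.
    rng = np.random.default_rng(0)
    X_train = rng.random((500, N_BANDS, 1))
    y_train = np.column_stack([rng.uniform(15, 25, 500),    # °Brix
                               rng.uniform(2.8, 4.0, 500)])  # pH
    X_new = rng.random((100, N_BANDS, 1))  # e.g. a new variety or vintage
    y_new = np.column_stack([rng.uniform(15, 25, 100),
                             rng.uniform(2.8, 4.0, 100)])

    # Train on the source vintages/varieties.
    model = build_model()
    model.fit(X_train, y_train, epochs=5, batch_size=32, verbose=0)

    # Transfer learning: freeze the convolutional feature extractor and
    # fine-tune only the dense head on a small sample from the new domain.
    for layer in model.layers:
        if isinstance(layer, layers.Conv1D):
            layer.trainable = False
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mse")
    model.fit(X_new[:50], y_new[:50], epochs=5, batch_size=16, verbose=0)

    # Evaluate with RMSEP on the held-out part of the new domain.
    preds = model.predict(X_new[50:], verbose=0)
    print("RMSEP [°Brix, pH]:", rmsep(y_new[50:], preds))
```

Freezing the convolutional layers and fine-tuning only the dense head is one common transfer-learning strategy; it reuses spectral features learned on the source vintages while adapting the regression head to the new variety or vintage. The transfer mechanism used in the article itself may differ.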