RadNet 1.0: exploring deep learning architectures for longwave radiative transfer


Bibliographic Details
Main Authors: Y. Liu, R. Caballero, J. M. Monteiro
Format: Article
Language: English
Published: Copernicus Publications, 2020-09-01
Series: Geoscientific Model Development
Online Access: https://gmd.copernicus.org/articles/13/4399/2020/gmd-13-4399-2020.pdf
id doaj-1d76fcb412464aaf8079523d8af3ca5f
record_format Article
spelling doaj-1d76fcb412464aaf8079523d8af3ca5f 2020-11-25T03:43:47Z
publisher Copernicus Publications
series Geoscientific Model Development
issn 1991-959X, 1991-9603
publishDate 2020-09-01, volume 13, pages 4399–4412
doi 10.5194/gmd-13-4399-2020
title RadNet 1.0: exploring deep learning architectures for longwave radiative transfer
authors Y. Liu, R. Caballero, J. M. Monteiro
url https://gmd.copernicus.org/articles/13/4399/2020/gmd-13-4399-2020.pdf
collection DOAJ
language English
format Article
sources DOAJ
author Y. Liu
R. Caballero
J. M. Monteiro
title RadNet 1.0: exploring deep learning architectures for longwave radiative transfer
publisher Copernicus Publications
series Geoscientific Model Development
issn 1991-959X
1991-9603
publishDate 2020-09-01
description <p>Simulating global and regional climate at high resolution is essential to study the effects of climate change and capture extreme events affecting human populations. To achieve this goal, the scalability of climate models and the efficiency of individual model components are both important. Radiative transfer is among the most computationally expensive components in a typical climate model. Here we attempt to model this component using a neural network. We aim to study the feasibility of replacing an explicit, physics-based computation of longwave radiative transfer with a neural network emulator and to assess the resultant performance gains. We compare multiple neural-network architectures, including a convolutional neural network, and our results suggest that the performance loss from the use of conventional convolutional networks is not offset by gains in accuracy. We train the networks with and without noise added to the input profiles and find that adding noise improves the ability of the networks to generalise beyond the training set. Prediction of radiative heating rates using our neural network models achieves up to a 370<span class="inline-formula">×</span> speedup on a GTX 1080 GPU setup and an 11<span class="inline-formula">×</span> speedup on a Xeon CPU setup compared to a state-of-the-art radiative transfer library running on the same Xeon CPU. Furthermore, our neural network models yield less than 0.1&thinsp;K&thinsp;d<span class="inline-formula"><sup>−1</sup></span> mean squared error across all pressure levels. Upon introducing this component into a single-column model, we find that the time evolution of the temperature and humidity profiles is physically reasonable, though the model is conservative in its prediction of heating rates in regions where the optical depth changes quickly. Differences exist in the equilibrium climate simulated when using the neural network, which are attributed to small systematic errors that accumulate over time. Thus, we find that the accuracy of the neural network in the “offline” mode does not reflect its performance when coupled with other components.</p>
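The abstract notes that adding noise to the input profiles improved the networks' generalisation beyond the training set. A minimal sketch of that kind of input-noise augmentation is shown below; the function name, the multiplicative form of the noise, and the 1 % amplitude are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def add_profile_noise(profiles, rel_sigma=0.01, rng=None):
    """Perturb each input profile with multiplicative Gaussian noise.

    profiles: array of shape (n_samples, n_levels), e.g. temperature
    or humidity on each pressure level.
    rel_sigma: relative noise amplitude (1 % here, an assumed value).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Draw one noise factor per value, centred on 1 so the mean
    # profile is preserved on average.
    factors = rng.normal(loc=1.0, scale=rel_sigma, size=profiles.shape)
    return profiles * factors

# Example: 4 synthetic temperature profiles on 10 pressure levels.
rng = np.random.default_rng(0)
clean = 250.0 + 50.0 * rng.random((4, 10))
noisy = add_profile_noise(clean, rel_sigma=0.01, rng=rng)
print(noisy.shape)  # (4, 10)
```

During training one would apply such a perturbation to each batch before the forward pass, so the network never sees exactly the same profile twice; this acts as a simple regulariser.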
url https://gmd.copernicus.org/articles/13/4399/2020/gmd-13-4399-2020.pdf
work_keys_str_mv AT yliu radnet10exploringdeeplearningarchitecturesforlongwaveradiativetransfer
AT rcaballero radnet10exploringdeeplearningarchitecturesforlongwaveradiativetransfer
AT jmmonteiro radnet10exploringdeeplearningarchitecturesforlongwaveradiativetransfer
_version_ 1724518386499584000