Simulator-based training of generative neural networks for the inverse design of metasurfaces
Metasurfaces are subwavelength-structured artificial media that can shape and localize electromagnetic waves in unique ways. The inverse design of these devices is a non-convex optimization problem in a high-dimensional space, making global optimization a major challenge. We present a new type of population-based global optimization algorithm for metasurfaces that is enabled by the training of a generative neural network. The loss function used for backpropagation depends on the generated pattern layouts, their efficiencies, and their efficiency gradients, which are calculated by the adjoint variables method using forward and adjoint electromagnetic simulations. We observe that the distribution of devices generated by the network shifts continuously towards high-performance regions of the design space over the course of optimization. Upon completion of training, the best generated devices have efficiencies comparable to or exceeding those of the best devices designed using standard topology optimization. Our proposed global optimization algorithm can be applied generally to other gradient-based optimization problems in optics, mechanics, and electronics.
Main Authors: | Jiang Jiaqi, Fan Jonathan A. |
---|---|
Format: | Article |
Language: | English |
Published: | De Gruyter, 2019-11-01 |
Series: | Nanophotonics |
Subjects: | simulator-based training; generative networks; neural networks; adjoint variable method; global optimization |
Online Access: | https://doi.org/10.1515/nanoph-2019-0330 |
id |
doaj-a24c7e5b8d2d4ac8be12fd3b8585678c |
record_format |
Article |
spelling |
Record ID: doaj-a24c7e5b8d2d4ac8be12fd3b8585678c (indexed 2021-09-06T19:20:33Z)
Language: eng
Publisher: De Gruyter
Journal: Nanophotonics, ISSN 2192-8606, 2192-8614
Published: 2019-11-01, vol. 9, no. 5, pp. 1059-1069
DOI: 10.1515/nanoph-2019-0330 (nanoph-2019-0330)
Title: Simulator-based training of generative neural networks for the inverse design of metasurfaces
Authors: Jiang Jiaqi [0]; Fan Jonathan A. [1]
Affiliations: [0] Department of Electrical Engineering, Stanford University, 348 Via Pueblo, Stanford, CA 94305, USA; [1] Department of Electrical Engineering, Stanford University, 348 Via Pueblo, Stanford, CA 94305, USA
Abstract: Metasurfaces are subwavelength-structured artificial media that can shape and localize electromagnetic waves in unique ways. The inverse design of these devices is a non-convex optimization problem in a high-dimensional space, making global optimization a major challenge. We present a new type of population-based global optimization algorithm for metasurfaces that is enabled by the training of a generative neural network. The loss function used for backpropagation depends on the generated pattern layouts, their efficiencies, and their efficiency gradients, which are calculated by the adjoint variables method using forward and adjoint electromagnetic simulations. We observe that the distribution of devices generated by the network shifts continuously towards high-performance regions of the design space over the course of optimization. Upon completion of training, the best generated devices have efficiencies comparable to or exceeding those of the best devices designed using standard topology optimization. Our proposed global optimization algorithm can be applied generally to other gradient-based optimization problems in optics, mechanics, and electronics.
Online access: https://doi.org/10.1515/nanoph-2019-0330
Subjects: simulator-based training; generative networks; neural networks; adjoint variable method; global optimization |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Jiang Jiaqi; Fan Jonathan A. |
spellingShingle |
Jiang Jiaqi; Fan Jonathan A.; Simulator-based training of generative neural networks for the inverse design of metasurfaces; Nanophotonics; simulator-based training; generative networks; neural networks; adjoint variable method; global optimization |
author_facet |
Jiang Jiaqi; Fan Jonathan A. |
author_sort |
Jiang Jiaqi |
title |
Simulator-based training of generative neural networks for the inverse design of metasurfaces |
title_short |
Simulator-based training of generative neural networks for the inverse design of metasurfaces |
title_full |
Simulator-based training of generative neural networks for the inverse design of metasurfaces |
title_fullStr |
Simulator-based training of generative neural networks for the inverse design of metasurfaces |
title_full_unstemmed |
Simulator-based training of generative neural networks for the inverse design of metasurfaces |
title_sort |
simulator-based training of generative neural networks for the inverse design of metasurfaces |
publisher |
De Gruyter |
series |
Nanophotonics |
issn |
2192-8606; 2192-8614 |
publishDate |
2019-11-01 |
description |
Metasurfaces are subwavelength-structured artificial media that can shape and localize electromagnetic waves in unique ways. The inverse design of these devices is a non-convex optimization problem in a high-dimensional space, making global optimization a major challenge. We present a new type of population-based global optimization algorithm for metasurfaces that is enabled by the training of a generative neural network. The loss function used for backpropagation depends on the generated pattern layouts, their efficiencies, and their efficiency gradients, which are calculated by the adjoint variables method using forward and adjoint electromagnetic simulations. We observe that the distribution of devices generated by the network shifts continuously towards high-performance regions of the design space over the course of optimization. Upon completion of training, the best generated devices have efficiencies comparable to or exceeding those of the best devices designed using standard topology optimization. Our proposed global optimization algorithm can be applied generally to other gradient-based optimization problems in optics, mechanics, and electronics. |
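The optimization scheme summarized in the description can be sketched in code. This is a minimal illustration under stated assumptions, not the paper's implementation: the generative network is reduced to the mean of a Gaussian distribution over device layouts, and the forward/adjoint electromagnetic simulations are replaced by a toy analytic efficiency function whose gradient stands in for the adjoint-method gradient. What the sketch preserves is the core idea: an efficiency-weighted gradient update that shifts the distribution of generated devices towards high-performance regions of the design space.

```python
import numpy as np

def simulate(x, target):
    """Toy stand-in for the simulator (an assumption of this sketch): returns
    the 'efficiency' of layout x and its gradient d(eff)/dx. A real
    implementation would run forward and adjoint electromagnetic simulations."""
    eff = float(np.exp(-np.sum((x - target) ** 2)))
    grad = -2.0 * (x - target) * eff  # analytic gradient of the toy efficiency
    return eff, grad

def optimize(dim=4, pop=32, steps=300, lr=0.1, sigma=0.5, seed=0):
    rng = np.random.default_rng(seed)
    target = rng.uniform(-1.0, 1.0, dim)  # unknown optimal layout (hypothetical)
    mu = np.zeros(dim)                    # mean of the distribution over layouts
    for _ in range(steps):
        # Sample a population of generated device layouts.
        devices = mu + 0.3 * rng.normal(size=(pop, dim))
        effs, grads = zip(*(simulate(d, target) for d in devices))
        # Efficiency-weighted update: high-performing devices (weight
        # ~ exp(eff / sigma)) contribute more, so the distribution shifts
        # towards high-performance regions over the course of optimization.
        w = np.exp(np.array(effs) / sigma)
        w /= w.sum()
        mu = mu + lr * np.sum(w[:, None] * np.array(grads), axis=0)
    return mu, target

mu, target = optimize()
print("final deviation from optimum:", np.max(np.abs(mu - target)))
```

The temperature-like parameter `sigma` controls how strongly the update favors the best devices in each generation: small values concentrate the pull on the current top performers, while large values average the gradient over the whole population and keep exploration broader.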
topic |
simulator-based training; generative networks; neural networks; adjoint variable method; global optimization |
url |
https://doi.org/10.1515/nanoph-2019-0330 |
work_keys_str_mv |
AT jiangjiaqi simulatorbasedtrainingofgenerativeneuralnetworksfortheinversedesignofmetasurfaces AT fanjonathana simulatorbasedtrainingofgenerativeneuralnetworksfortheinversedesignofmetasurfaces |
_version_ |
1717776496914857984 |