Multi‐style Chinese art painting generation of flowers

Bibliographic Details
Main Authors: Feifei Fu, Jiancheng Lv, Chenwei Tang, Mao Li
Format: Article
Language: English
Published: Wiley 2021-02-01
Series: IET Image Processing
Online Access: https://doi.org/10.1049/ipr2.12059
Description
Summary: With the proposal and development of Generative Adversarial Networks, great achievements have been made in the field of image generation, and many works on the generation of painting art have followed. However, due to the difficulty of data collection and the fundamental challenge posed by freehand expression, the generation of traditional Chinese painting is still far from perfect. This paper specialises in the generation of Chinese art paintings of flowers, an important and classic subject, using deep learning. First, an unpaired flower-painting dataset covering three classic Chinese painting styles, line drawing, meticulous, and ink, is constructed. Then, based on the collected dataset, a Flower‐Generative Adversarial Network framework is proposed to generate multi‐style Chinese art paintings of flowers. The Flower‐Generative Adversarial Network, consisting of attention‐guided generators and discriminators, transfers style among line drawing, meticulous, and ink through adversarial training. Moreover, to address the artefacts and blur produced by existing generation methods, a new loss function based on Multi‐Scale Structural Similarity is introduced to enforce structure preservation. Extensive experiments show that the proposed Flower‐Generative Adversarial Network framework produces better multi‐style Chinese art paintings of flowers than existing methods.
ISSN: 1751-9659
1751-9667
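
Note: the summary above mentions a Multi‐Scale Structural Similarity (MS-SSIM) loss introduced to suppress artefacts and blur. The paper's exact formulation is not reproduced in this record; the following is a minimal PyTorch sketch of a standard MS-SSIM loss (five scales with Wang et al.'s usual weighting) that could play that role. The window size, sigma, scale weights, and the blending coefficient lambda_ms in the closing comment are assumptions, not values from the paper.

import torch
import torch.nn.functional as F

def _gaussian_window(size: int = 11, sigma: float = 1.5) -> torch.Tensor:
    # Separable 2-D Gaussian used to compute local SSIM statistics.
    coords = torch.arange(size, dtype=torch.float32) - size // 2
    g = torch.exp(-(coords ** 2) / (2 * sigma ** 2))
    g = g / g.sum()
    return torch.outer(g, g)

def _ssim_terms(x, y, window, c1=0.01 ** 2, c2=0.03 ** 2):
    # x, y: (N, C, H, W) tensors scaled to [0, 1]; window applied per channel.
    ch = x.shape[1]
    w = window.expand(ch, 1, *window.shape).to(x)
    mu_x = F.conv2d(x, w, groups=ch)
    mu_y = F.conv2d(y, w, groups=ch)
    var_x = F.conv2d(x * x, w, groups=ch) - mu_x ** 2
    var_y = F.conv2d(y * y, w, groups=ch) - mu_y ** 2
    cov = F.conv2d(x * y, w, groups=ch) - mu_x * mu_y
    lum = (2 * mu_x * mu_y + c1) / (mu_x ** 2 + mu_y ** 2 + c1)  # luminance
    cs = (2 * cov + c2) / (var_x + var_y + c2)                   # contrast-structure
    return lum.mean(), cs.mean()

def ms_ssim_loss(x, y, weights=(0.0448, 0.2856, 0.3001, 0.2363, 0.1333)):
    # Returns 1 - MS-SSIM(x, y) so it can be minimised directly.
    # Five scales with an 11x11 window need inputs of roughly 176 px or more.
    window = _gaussian_window()
    ms, eps = x.new_ones(()), 1e-6
    for i, w_i in enumerate(weights):
        lum, cs = _ssim_terms(x, y, window)
        if i < len(weights) - 1:
            ms = ms * cs.clamp(min=eps) ** w_i
            x, y = F.avg_pool2d(x, 2), F.avg_pool2d(y, 2)  # halve resolution for the next scale
        else:
            ms = ms * (lum * cs).clamp(min=eps) ** w_i  # luminance only at the coarsest scale
    return 1 - ms

# Hypothetical generator objective in the spirit of the summary: since the
# dataset is unpaired, the structure term would typically compare a source
# painting with its translated output (lambda_ms is an assumed weight):
#   loss_G = adversarial_loss + lambda_ms * ms_ssim_loss(translated, source)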