MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning
Master's thesis === National Tsing Hua University === Department of Computer Science === Academic Year 106 === Recent studies on neural architecture search have shown that automatically designed neural networks perform as well as human-designed architectures. However, most existing work on neural architecture search aims at finding architectures that optimize only for prediction...
| Main Authors: | Hsu, Chi-Hung (許啟宏) |
|---|---|
| Other Authors: | Chang, Shih-Chieh (張世杰) |
| Format: | Others |
| Language: | en_US |
| Published: | 2018 |
| Online Access: | http://ndltd.ncl.edu.tw/handle/s6yz2y |
| id | ndltd-TW-106NTHU5392045 |
|---|---|
| record_format | oai_dc |
| spelling | ndltd-TW-106NTHU53920452019-05-16T00:52:40Z http://ndltd.ncl.edu.tw/handle/s6yz2y MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning 應用強化學習方法之多目標類神經網路架構探索 Hsu, Chi-Hung (許啟宏). Master's thesis, National Tsing Hua University, Department of Computer Science, Academic Year 106. Recent studies on neural architecture search have shown that automatically designed neural networks perform as well as human-designed architectures. However, most existing work on neural architecture search aims at finding architectures that optimize only for prediction accuracy; these methods may generate complex architectures with excessively high energy consumption, which are not suitable for computing environments with limited power budgets. We propose MONAS, a Multi-Objective Neural Architecture Search framework with novel reward functions that consider both prediction accuracy and power consumption when exploring neural architectures. MONAS effectively explores the design space and searches for architectures satisfying the given requirements. The experimental results demonstrate that the architectures found by MONAS achieve accuracy comparable to or better than the state-of-the-art models, while having better energy efficiency. Chang, Shih-Chieh (張世杰). 2018. 學位論文 ; thesis, 24. en_US |
| collection | NDLTD |
| language | en_US |
| format | Others |
| sources | NDLTD |
| description | Master's thesis === National Tsing Hua University === Department of Computer Science === Academic Year 106 === Recent studies on neural architecture search have shown that automatically designed neural networks perform as well as human-designed architectures. However, most existing work on neural architecture search aims at finding architectures that optimize only for prediction accuracy; these methods may generate complex architectures with excessively high energy consumption, which are not suitable for computing environments with limited power budgets. We propose MONAS, a Multi-Objective Neural Architecture Search framework with novel reward functions that consider both prediction accuracy and power consumption when exploring neural architectures. MONAS effectively explores the design space and searches for architectures satisfying the given requirements. The experimental results demonstrate that the architectures found by MONAS achieve accuracy comparable to or better than the state-of-the-art models, while having better energy efficiency. |
| author2 | Chang, Shih-Chieh (張世杰) |
| author_facet | Chang, Shih-Chieh; Hsu, Chi-Hung (許啟宏) |
| author | Hsu, Chi-Hung (許啟宏) |
| spellingShingle | Hsu, Chi-Hung (許啟宏); MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning |
| author_sort | Hsu, Chi-Hung |
| title | MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning |
| title_short | MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning |
| title_full | MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning |
| title_fullStr | MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning |
| title_full_unstemmed | MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning |
| title_sort | monas: multi-objective neural architecture search using reinforcement learning |
| publishDate | 2018 |
| url | http://ndltd.ncl.edu.tw/handle/s6yz2y |
| work_keys_str_mv | AT hsuchihung monasmultiobjectiveneuralarchitecturesearchusingreinforcementlearning AT xǔqǐhóng monasmultiobjectiveneuralarchitecturesearchusingreinforcementlearning AT hsuchihung yīngyòngqiánghuàxuéxífāngfǎzhīduōmùbiāolèishénjīngwǎnglùjiàgòutànsuǒ AT xǔqǐhóng yīngyòngqiánghuàxuéxífāngfǎzhīduōmùbiāolèishénjīngwǎnglùjiàgòutànsuǒ |
| _version_ | 1719171365873909760 |
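
The abstract in this record describes reward functions that weigh prediction accuracy against power consumption during architecture search. As a rough illustration only (the record does not give the thesis's actual formulation), a minimal Python sketch of such a mixed reward follows; the function name `mixed_reward` and the parameters `alpha` and `max_power` are illustrative assumptions, not names from the thesis.

```python
# A minimal sketch of a multi-objective reward of the kind the abstract
# describes: trading off prediction accuracy against power consumption.
# The weighting scheme and the names (alpha, max_power) are assumptions.

def mixed_reward(accuracy: float, power_watts: float,
                 max_power: float = 70.0, alpha: float = 0.8) -> float:
    """Combine accuracy (0..1) and normalized power into one scalar.

    A reinforcement-learning controller proposing architectures can be
    trained to maximize this value, steering the search toward models
    that are both accurate and energy-efficient.
    """
    normalized_power = min(power_watts / max_power, 1.0)
    return alpha * accuracy - (1.0 - alpha) * normalized_power


if __name__ == "__main__":
    # A slightly less accurate but much cheaper model can score higher here.
    print(mixed_reward(accuracy=0.94, power_watts=60.0))  # accurate, power-hungry
    print(mixed_reward(accuracy=0.92, power_watts=30.0))  # close accuracy, half the power
```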