Pluggable Micronetwork for Layer Configuration Relay in a Dynamic Deep Neural Surface
The classical convolutional neural network architecture adheres to static declaration procedures: the shape of the computation is usually predefined and the computation graph is fixed. In this research, the concept of a pluggable micronetwork, which relaxes the static declaration constraint through dynamic layer configuration relay, is proposed.
Main Authors: Farhat Ullah Khan, Izzatdin B. Aziz, Emilia Akashah P. Akhir
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access
Subjects: Convolution neural network; deep learning; dynamic neural structure; micronetwork; multilayer perceptron
Online Access: https://ieeexplore.ieee.org/document/9530389/
id
doaj-b1bb0f1e7a394003ad07f25ed06cd16d
record_format
Article
spelling
doaj-b1bb0f1e7a394003ad07f25ed06cd16d; 2021-09-14T23:01:18Z; eng; IEEE; IEEE Access; ISSN 2169-3536; 2021-01-01; vol. 9, pp. 124831-124846; DOI 10.1109/ACCESS.2021.3110709; IEEE Xplore document 9530389; Pluggable Micronetwork for Layer Configuration Relay in a Dynamic Deep Neural Surface; Farhat Ullah Khan (https://orcid.org/0000-0001-7193-0895), Izzatdin B. Aziz (https://orcid.org/0000-0003-2654-4463), Emilia Akashah P. Akhir (https://orcid.org/0000-0002-7620-6625), all with the Center for Research in Data Science (CERDAS), Universiti Teknologi PETRONAS, Seri Iskander, Perak, Malaysia
collection
DOAJ
language
English
format
Article
sources
DOAJ
author
Farhat Ullah Khan, Izzatdin B. Aziz, Emilia Akashah P. Akhir
author_sort
Farhat Ullah Khan
title
Pluggable Micronetwork for Layer Configuration Relay in a Dynamic Deep Neural Surface
title_short
Pluggable Micronetwork for Layer Configuration Relay in a Dynamic Deep Neural Surface
title_full
Pluggable Micronetwork for Layer Configuration Relay in a Dynamic Deep Neural Surface
title_fullStr
Pluggable Micronetwork for Layer Configuration Relay in a Dynamic Deep Neural Surface
title_full_unstemmed
Pluggable Micronetwork for Layer Configuration Relay in a Dynamic Deep Neural Surface
title_sort
pluggable micronetwork for layer configuration relay in a dynamic deep neural surface
publisher
IEEE
series
IEEE Access
issn
2169-3536
publishDate
2021-01-01
description
The classical convolutional neural network architecture adheres to static declaration procedures: the shape of the computation is usually predefined and the computation graph is fixed. In this research, the concept of a pluggable micronetwork, which relaxes the static declaration constraint through dynamic layer configuration relay, is proposed. The micronetwork consists of several parallel convolutional layer configurations and relays only the layer settings that incur the minimum loss. The configuration selection logic is based on the conditional computation method, implemented as the output layer of the proposed micronetwork. The micronetwork is built as an independent pluggable unit and can be used anywhere on the deep learning decision surface with minimal or no configuration changes. The MNIST, FMNIST, CIFAR-10 and STL-10 datasets were used to validate the proposed research. The technique proves efficient, obtaining state-of-the-art performance in fewer iterations with wider yet compact convolution models. The computational complexities involved in these advanced deep neural structures are also briefly discussed.
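For readers who want a concrete picture of the mechanism outlined in the description, the short PyTorch snippet below sketches the general idea only: a pluggable block that holds several parallel convolutional layer configurations plus a conditional-computation output layer that relays a single configuration's result. It is not the authors' implementation; the class name `PluggableMicronetwork`, the kernel sizes, and the gating criterion (a learned score with a hard arg-max) are illustrative assumptions.

```python
# Minimal, illustrative sketch (not the paper's code): parallel convolutional
# configurations with a selection layer that relays one configuration's output.
import torch
import torch.nn as nn


class PluggableMicronetwork(nn.Module):
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # Parallel candidate layer configurations (kernel sizes are assumptions).
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=k, padding=k // 2),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for k in (1, 3, 5)
        ])
        # Conditional-computation "output layer": scores each configuration.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(in_ch, len(self.branches)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scores = self.gate(x)            # (N, num_branches)
        choice = scores.argmax(dim=1)    # hard per-sample configuration choice
        # For clarity every branch is evaluated here; true conditional
        # computation would execute only the selected configuration.
        outputs = torch.stack([branch(x) for branch in self.branches], dim=1)
        idx = choice.view(-1, 1, 1, 1, 1).expand(-1, 1, *outputs.shape[2:])
        return outputs.gather(dim=1, index=idx).squeeze(1)  # relay one branch


if __name__ == "__main__":
    block = PluggableMicronetwork(in_ch=3, out_ch=16)
    y = block(torch.randn(4, 3, 32, 32))
    print(y.shape)  # torch.Size([4, 16, 32, 32])
```

In the paper's setting the relayed configuration is the one incurring the minimum loss and the unselected branches are skipped; this sketch instead evaluates all branches and selects afterwards, and the hard arg-max gate would need a per-branch loss signal or a differentiable relaxation to be trainable.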
topic
Convolution neural network; deep learning; dynamic neural structure; micronetwork; multilayer perceptron
url
https://ieeexplore.ieee.org/document/9530389/
_version_
1717379491192373248 |