Model Reduction Through Progressive Latent Space Pruning in Deep Active Inference
Although still not fully understood, sleep is known to play an important role in learning and in pruning synaptic connections. From the active inference perspective, this can be cast as learning parameters of a generative model and Bayesian model reduction, respectively. In this article, we show how...
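For orientation, the "Bayesian model reduction" the abstract refers to has a standard analytic form (Friston and colleagues); the sketch below gives that general identity only and is not a reproduction of this article's specific latent-space pruning scheme. Here $p(\theta)$ and $q(\theta)$ are the prior and (approximate) posterior of the full model, and $\tilde{p}(\theta)$ is the prior of a reduced model.

```latex
% Standard Bayesian model reduction identities (general form; the article's
% application to deep active inference latent spaces is not shown here).

% Change in free energy (log-evidence) from full to reduced model,
% computable from the full posterior without refitting the data:
\Delta F \;=\; \ln \int q(\theta)\,\frac{\tilde{p}(\theta)}{p(\theta)}\,d\theta

% Posterior under the reduced prior, again obtained analytically:
\tilde{q}(\theta) \;\propto\; q(\theta)\,\frac{\tilde{p}(\theta)}{p(\theta)}
```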
| Main Authors: | Çatal, O. (Author), De Boom, C. (Author), Dhoedt, B. (Author), Verbelen, T. (Author), Wauthier, S.T. (Author) |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2022 |
| Online Access: | View Fulltext in Publisher |
Similar Items
- Learning Generative State Space Models for Active Inference
  by: Ozan Çatal, et al.
  Published: (2020-11-01)
- Active Vision for Robot Manipulators Using the Free Energy Principle
  by: Toon Van de Maele, et al.
  Published: (2021-03-01)
- Embodied Object Representation Learning and Recognition
  by: Çatal, O., et al.
  Published: (2022)
- Probabilistic Models with Deep Neural Networks
  by: Andrés R. Masegosa, et al.
  Published: (2021-01-01)
- Training deep neural density estimators to identify mechanistic models of neural dynamics
  by: Pedro J Gonçalves, et al.
  Published: (2020-09-01)