Neurally inspired silicon learning : from synapse transistors to learning arrays

A computation is an operation that can be performed by a physical machine. We are familiar with digital computers: Machines based on a simple logic function (the binary NOR) and optimized for manipulating numeric variables with high precision. Other computing machines exist: The neurocomputer, the analog computer, the quantum computer, and the DNA computer all are known. Neurocomputers, defined colloquially as computing machines comprising nervous tissue, exist; that they are computers also is certain. Nervous tissue solves ill-posed problems in real time. The principles underlying neural computation, however, remain for now a mystery. I believe that there are fundamental principles of computation that we can learn by studying neurobiology. If we can understand how biological information-processing systems operate, then we can learn how to build circuits and systems that deal naturally with real-world data. My goal is to investigate the organizational and adaptive principles on which neural systems operate, and to build silicon integrated circuits that compute using these principles. I call my approach silicon neuroscience: the development of neurally inspired silicon-learning systems. I have developed, in a standard CMOS process, a family of single-transistor devices that I call synapse transistors. Like neural synapses, synapse transistors provide nonvolatile analog memory, compute the product of this stored memory and the applied input, allow bidirectional memory updates, and simultaneously perform an analog computation and determine locally their own memory updates. I have fabricated a synaptic array that affords a high synapse-transistor density, mimics the low power consumption of nervous tissue, and performs both fast, parallel computation and slow, local adaptation. Like nervous tissue, my array simultaneously and in parallel performs an analog computation and updates the nonvolatile analog memory. Although I do not believe that a single transistor can model the complex behavior of a neural synapse completely, my synapse transistors do implement a local learning function. I consider their development to be a first step toward achieving my goal of a silicon learning system.

Bibliographic Details
Main Author: Diorio, Christopher J.
Format: Others
Language: en
Published: 1997
Online Access: https://thesis.library.caltech.edu/88/1/Diorio_c_1997.pdf
Diorio, Christopher J. (1997) Neurally inspired silicon learning : from synapse transistors to learning arrays. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/vbyq-fy15. https://resolver.caltech.edu/CaltechETD:etd-01092008-080326
id ndltd-CALTECH-oai-thesis.library.caltech.edu-88
record_format oai_dc
spelling ndltd-CALTECH-oai-thesis.library.caltech.edu-88 2021-04-17T05:01:31Z https://thesis.library.caltech.edu/88/ Neurally inspired silicon learning : from synapse transistors to learning arrays Diorio, Christopher J. A computation is an operation that can be performed by a physical machine. We are familiar with digital computers: Machines based on a simple logic function (the binary NOR) and optimized for manipulating numeric variables with high precision. Other computing machines exist: The neurocomputer, the analog computer, the quantum computer, and the DNA computer all are known. Neurocomputers, defined colloquially as computing machines comprising nervous tissue, exist; that they are computers also is certain. Nervous tissue solves ill-posed problems in real time. The principles underlying neural computation, however, remain for now a mystery. I believe that there are fundamental principles of computation that we can learn by studying neurobiology. If we can understand how biological information-processing systems operate, then we can learn how to build circuits and systems that deal naturally with real-world data. My goal is to investigate the organizational and adaptive principles on which neural systems operate, and to build silicon integrated circuits that compute using these principles. I call my approach silicon neuroscience: the development of neurally inspired silicon-learning systems. I have developed, in a standard CMOS process, a family of single-transistor devices that I call synapse transistors. Like neural synapses, synapse transistors provide nonvolatile analog memory, compute the product of this stored memory and the applied input, allow bidirectional memory updates, and simultaneously perform an analog computation and determine locally their own memory updates. I have fabricated a synaptic array that affords a high synapse-transistor density, mimics the low power consumption of nervous tissue, and performs both fast, parallel computation and slow, local adaptation. Like nervous tissue, my array simultaneously and in parallel performs an analog computation and updates the nonvolatile analog memory. Although I do not believe that a single transistor can model the complex behavior of a neural synapse completely, my synapse transistors do implement a local learning function. I consider their development to be a first step toward achieving my goal of a silicon learning system. 1997 Thesis NonPeerReviewed application/pdf en other https://thesis.library.caltech.edu/88/1/Diorio_c_1997.pdf Diorio, Christopher J. (1997) Neurally inspired silicon learning : from synapse transistors to learning arrays. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/vbyq-fy15. https://resolver.caltech.edu/CaltechETD:etd-01092008-080326 CaltechETD:etd-01092008-080326 10.7907/vbyq-fy15
collection NDLTD
language en
format Others
sources NDLTD
description A computation is an operation that can be performed by a physical machine. We are familiar with digital computers: Machines based on a simple logic function (the binary NOR) and optimized for manipulating numeric variables with high precision. Other computing machines exist: The neurocomputer, the analog computer, the quantum computer, and the DNA computer all are known. Neurocomputers, defined colloquially as computing machines comprising nervous tissue, exist; that they are computers also is certain. Nervous tissue solves ill-posed problems in real time. The principles underlying neural computation, however, remain for now a mystery. I believe that there are fundamental principles of computation that we can learn by studying neurobiology. If we can understand how biological information-processing systems operate, then we can learn how to build circuits and systems that deal naturally with real-world data. My goal is to investigate the organizational and adaptive principles on which neural systems operate, and to build silicon integrated circuits that compute using these principles. I call my approach silicon neuroscience: the development of neurally inspired silicon-learning systems. I have developed, in a standard CMOS process, a family of single-transistor devices that I call synapse transistors. Like neural synapses, synapse transistors provide nonvolatile analog memory, compute the product of this stored memory and the applied input, allow bidirectional memory updates, and simultaneously perform an analog computation and determine locally their own memory updates. I have fabricated a synaptic array that affords a high synapse-transistor density, mimics the low power consumption of nervous tissue, and performs both fast, parallel computation and slow, local adaptation. Like nervous tissue, my array simultaneously and in parallel performs an analog computation and updates the nonvolatile analog memory. Although I do not believe that a single transistor can model the complex behavior of a neural synapse completely, my synapse transistors do implement a local learning function. I consider their development to be a first step toward achieving my goal of a silicon learning system.
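The abstract describes each synapse transistor as combining three roles: a nonvolatile analog weight, an output equal to the product of that weight and the applied input, and a local, bidirectional weight update computed from the same signals. A minimal behavioral sketch of that description, purely as a toy model (the class and method names are illustrative, not from the thesis, and no device physics is modeled):

```python
class SynapseTransistor:
    """Toy behavioral model of a synapse transistor as described in the
    abstract: nonvolatile analog memory, multiplicative computation, and
    a local, bidirectional memory update. Illustrative only."""

    def __init__(self, weight=1.0, learn_rate=0.01):
        self.weight = weight          # stands in for the nonvolatile analog memory
        self.learn_rate = learn_rate  # hypothetical adaptation-rate parameter

    def compute(self, x):
        # Analog computation: product of the stored memory and the applied input.
        return self.weight * x

    def adapt(self, x, error):
        # Local, bidirectional update: a positive or negative error signal
        # raises or lowers the stored weight using only locally available values.
        self.weight += self.learn_rate * error * x
        return self.weight


syn = SynapseTransistor(weight=0.5)
y = syn.compute(2.0)   # 0.5 * 2.0 = 1.0
syn.adapt(2.0, +1.0)   # weight moves up
syn.adapt(2.0, -1.0)   # weight moves back down
```

In the device itself, computation and adaptation happen simultaneously and in parallel across the array; this sequential sketch only illustrates the local learning rule's structure.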
author Diorio, Christopher J.
spellingShingle Diorio, Christopher J.
Neurally inspired silicon learning : from synapse transistors to learning arrays
author_facet Diorio, Christopher J.
author_sort Diorio, Christopher J.
title Neurally inspired silicon learning : from synapse transistors to learning arrays
title_short Neurally inspired silicon learning : from synapse transistors to learning arrays
title_full Neurally inspired silicon learning : from synapse transistors to learning arrays
title_fullStr Neurally inspired silicon learning : from synapse transistors to learning arrays
title_full_unstemmed Neurally inspired silicon learning : from synapse transistors to learning arrays
title_sort neurally inspired silicon learning : from synapse transistors to learning arrays
publishDate 1997
url https://thesis.library.caltech.edu/88/1/Diorio_c_1997.pdf
work_keys_str_mv AT dioriochristopherj neurallyinspiredsiliconlearningfromsynapsetransistorstolearningarrays
_version_ 1719396526776647680