Summary: Master's === Tunghai University === In-Service Master's Program, Department of Computer Science and Information Engineering === 94 === To make Hebbian-type Associative Memories widely applicable, the most common approach at present is to implement them in VLSI. However, as the number of stored patterns grows, the number of interconnections between the neurons of a traditional Hebbian-type Associative Memory increases rapidly as well, which becomes a bottleneck in actual VLSI fabrication. There are two directions for solving this problem: one is to develop high-order Hebbian-type Associative Memories, and the other is to reduce the number of interconnections inside the memory. Although a high-order Hebbian-type Associative Memory can store more patterns, its internal connections inevitably grow even faster. Reducing the interconnections is therefore the fundamental way to settle this problem once and for all.
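To make the interconnection growth concrete, here is a minimal sketch (my illustration, not the thesis's implementation) of a first-order Hebbian-type associative memory: storing bipolar patterns with the outer-product rule yields a dense n × n weight matrix, i.e. O(n²) interconnections, which is exactly the wiring that burdens a VLSI layout.

```python
import numpy as np

def store(patterns):
    """Outer-product (Hebbian) learning: W = sum_k x_k x_k^T, zero diagonal.

    patterns: array of shape (M, n) with entries in {-1, +1}.
    Returns the n x n weight matrix -- O(n^2) interconnections,
    the VLSI wiring bottleneck discussed above.
    """
    W = patterns.T @ patterns          # sum of outer products
    np.fill_diagonal(W, 0)             # no self-connections
    return W

def recall(W, probe, steps=10):
    """Synchronous recall: x <- sign(W x), iterated a few steps."""
    x = probe.copy()
    for _ in range(steps):
        nxt = np.where(W @ x >= 0, 1, -1)
        if np.array_equal(nxt, x):     # fixed point reached
            break
        x = nxt
    return x
```

Note that every entry of W is a physical wire weight in a VLSI realization, so shrinking the precision (or count) of these entries is what the quantization strategies below are after.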
Using a quantization strategy to reduce the interconnections is quite an efficient approach. Chung and Tsai analyzed the quantization of the connection values of Hebbian-type Associative Memories and found that the memories retain fairly good performance after quantization. The strategies they applied were two-level, three-level, and linear quantization. One important characteristic of Hebbian-type Associative Memories is that their interconnection values follow a Gaussian distribution; exploiting this property as the basis of a non-linear quantization strategy can further enhance the memory's performance.
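As a rough sketch of what two- and three-level weight quantization can look like (the threshold choice here is only illustrative, not necessarily the scheme Chung and Tsai used):

```python
import numpy as np

def quantize_two_level(W):
    """Two-level quantization: keep only the sign of each weight."""
    return np.where(W >= 0, 1, -1)

def quantize_three_level(W, t):
    """Three-level quantization: weights in (-t, t) are cut to 0,
    the rest to +/-1.  The threshold t is an assumed parameter
    for illustration."""
    Q = np.zeros_like(W)
    Q[W >= t] = 1
    Q[W <= -t] = -1
    return Q
```

Because the weights cluster around zero in a roughly Gaussian shape, a uniform (linear) spacing of quantization levels wastes resolution in the sparse tails, which motivates the non-linear strategy developed next.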
In this research, we introduce first-order and quadratic (second-order) Hebbian-type Associative Memories and derive the equation for the probability of direct convergence (the probability that a probe is recalled correctly in a single update) under the linear quantization strategy. The key point of this research is the non-linear quantization strategy: we integrate the Gaussian probability density function, divide its area into the required number of equal-area segments, find the length each segment occupies on the x-axis, and then compute the upper and lower threshold values from the proportion of each segment's area to the whole. Substituting these values back into the original linear-quantization equation yields the probability-of-direct-convergence equation for non-linearly quantized Hebbian-type Associative Memories. We can then determine which strategy is superior by comparing the linear and non-linear probability-of-direct-convergence equations.
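The equal-area partitioning step described above can be sketched directly: since every segment must hold the same fraction of the Gaussian's total area, the segment boundaries on the x-axis are simply inverse-CDF (quantile) values at equally spaced probabilities. The function name and the use of SciPy's norm.ppf are my own illustration, not the thesis's notation.

```python
import numpy as np
from scipy.stats import norm

def equal_area_thresholds(levels, mu=0.0, sigma=1.0):
    """Split the Gaussian N(mu, sigma^2) into `levels` regions of
    equal area and return the (levels - 1) interior boundaries.

    Each region must contain 1/levels of the total area, so
    boundary i is the quantile at cumulative probability i/levels.
    """
    probs = np.arange(1, levels) / levels          # i/levels, i = 1..levels-1
    return norm.ppf(probs, loc=mu, scale=sigma)    # inverse CDF

# Example: 4 equal-area regions of the standard Gaussian.
# Boundaries ~ [-0.674, 0.0, 0.674]; they crowd toward the mean,
# where the weight values concentrate -- the intuition behind the
# non-linear strategy.
print(equal_area_thresholds(4))
```

These boundaries then serve as the upper and lower limits carried back into the probability-of-direct-convergence equation in place of the uniformly spaced linear thresholds.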
Comparing the experimental results of the linear and non-linear quantization probability-of-direct-convergence equations, we can clearly observe that the convergence probability under the non-linear quantization strategy is far superior to that under the linear strategy. Therefore, when fabricating chips, the non-linear quantization strategy has greater practical merit for Hebbian-type Associative Memories.