Summary: | Master's thesis === National Taiwan Normal University === Graduate Institute of Mathematics === 91 === Abstract. We propose a solution to a fundamental problem in neural nets: "Given an
arbitrary stored set of fundamental memories, does there exist a recursive network for which these
fundamental memories are stable equilibrium states of the network?" The heart of it is
the concept of the emergent set, a Hamming star-convexity packing in the n-cube, the
mathematical framework of Hebb's strengthened learning rule, and the CAM algorithm. We
prove that the set of stable equilibrium states of the threshold network constructed by Hebb's
strengthened learning rule that responds to incoming signals of the states of fundamental
memories is the 01-span of the emergence of fundamental memories. On this basis, we reduce
the question to the problem of constructing a threshold network with sparse connections that
responds to incoming signals of the states of a generator of fundamental memories, thereby
probing the collective dynamics of the network. One of the great intellectual challenges is to
find the mechanism for the storage of memory. The solution of the Content-Addressable Memory
Problem indicates a mechanism for the storage of memory: a network produced in the brain,
by taking the kernel of the received stored memory items as incoming signals, can correctly
yield the entire memory items on the basis of sufficient partial information through chaotic
dynamics with a regular strategy-set.
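As a rough illustration of the kind of construction the abstract describes (a sketch only, using the classical outer-product Hebbian rule on a Hopfield-style threshold network, not the thesis's strengthened rule or CAM algorithm; the function names `hebb_weights` and `update` are illustrative): fundamental memories coded in {-1, +1} become fixed points of the threshold dynamics, and a state carrying sufficient partial information of a memory is driven back to the entire memory item.

```python
import numpy as np

def hebb_weights(patterns):
    """Outer-product Hebbian rule: W = (1/n) * sum_k m_k m_k^T, zero diagonal."""
    _, n = patterns.shape
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)   # no self-connections
    return W

def update(W, state):
    """One synchronous threshold update: each unit takes the sign of its input."""
    s = np.sign(W @ state)
    s[s == 0] = 1              # break ties toward +1
    return s.astype(int)

# Two orthogonal fundamental memories on the 8-cube, coded in {-1, +1}.
mem = np.array([[1,  1, 1,  1, -1, -1, -1, -1],
                [1, -1, 1, -1,  1, -1,  1, -1]])
W = hebb_weights(mem)

# Each fundamental memory is a stable equilibrium state (fixed point).
for m in mem:
    assert np.array_equal(update(W, m), m)

# Partial information: corrupt one component of the first memory; the
# dynamics restore the entire memory item from the partial cue.
cue = mem[0].copy()
cue[0] = -cue[0]
assert np.array_equal(update(W, cue), mem[0])
```

With orthogonal patterns the cross-talk term vanishes, so each stored memory survives the threshold operation exactly; this is the elementary case of the stability question the abstract poses.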
|