The scope of the theory of neural networks has been expanding rapidly. Statistical-mechanical techniques stemming from the theory of spin glasses have played important roles in the analysis of model systems. This chapter summarizes basic concepts, such as neurons, synapses, and axons, and studies the characteristics of networks whose interneuron connections are given by a specific prescription called the generalized Hebb rule. An important goal is to elucidate the structure of the phase diagram with various parameters, such as the temperature and the number of memorized patterns, as its axes. A related quantity is the storage capacity of a network, the number of patterns it can memorize. The problem of learning, in which the connections change gradually according to some rule to achieve specified goals, is deferred to the next chapter.
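As an illustration of the ideas summarized above, the following sketch stores binary patterns in a Hopfield-style network using the standard Hebb rule (a special case of the generalized prescription discussed in the chapter) and retrieves one pattern from a corrupted cue with zero-temperature dynamics. The network size, number of patterns, and update schedule are illustrative choices, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5  # neurons and stored patterns; P/N is well below capacity

# Random binary patterns xi^mu with components in {-1, +1}
xi = rng.choice([-1, 1], size=(P, N))

# Hebb rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling
J = xi.T @ xi / N
np.fill_diagonal(J, 0.0)

def recall(s, steps=20):
    """Zero-temperature (deterministic) synchronous dynamics:
    s_i <- sign(sum_j J_ij s_j), iterated until (hopefully) a fixed point."""
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1  # break the rare zero-field tie deterministically
    return s

# Cue: pattern 0 with 10 of the 100 spins flipped
s = xi[0].copy()
flip = rng.choice(N, size=10, replace=False)
s[flip] *= -1

recovered = recall(s)
overlap = recovered @ xi[0] / N  # overlap m = 1 means perfect retrieval
print(overlap)
```

Because the loading ratio P/N here is far below the critical capacity, the dynamics typically flows to a fixed point at (or very near) the stored pattern; raising P toward the capacity limit or the temperature toward the phase boundary destroys retrieval, which is what the phase diagram mentioned above characterizes.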