
神经网络与机器学习.pptx (Neural Networks and Machine Learning)

Published: 2021-12-04 · approx. 2,380 characters · 21 pages
Chapter 11 Stochastic Methods Rooted in Statistical Mechanics

Figure 11.1 A periodic recurrent Markov chain with d = 3.

Figure 11.2 State-transition diagram of the Markov chain for Example 1: the states x1 and x2 may be identified as up-to-date and behind, respectively.

Figure 11.3 State-transition diagram of the Markov chain for Example 2.

Figure 11.4 Classification of the states of a Markov chain and their associated long-term behavior.

Table 11.1

Figure 11.5 Architectural graph of the Boltzmann machine; K is the number of visible neurons, and L is the number of hidden neurons. The distinguishing features of the machine are: 1. The connections between the visible and hidden neurons are symmetric. 2. The symmetric connections are extended to both the visible and hidden neurons.

Figure 11.6 Sigmoid-shaped function P(v).

Figure 11.7 Directed (logistic) belief network.

Figure 11.8 Neural structure of the restricted Boltzmann machine (RBM). Contrasting this with Fig. 11.5, we see that, unlike the Boltzmann machine, there are no connections among the visible neurons or among the hidden neurons in the RBM.

Figure 11.9 Top-down learning, using a logistic belief network of infinite depth.

Figure 11.10 A hybrid generative model in which the top two layers form a restricted Boltzmann machine and the lower two layers form a directed model. The weights shown with blue shaded arrows are not part of the generative model; they are used to infer the feature values given the data, but they are not used for generating data.

Figure 11.11 Illustration of the progression of alternating Gibbs sampling in an RBM. After sufficiently many steps, the visible and hidden vectors are sampled from the stationary distribution defined by the current parameters of the model.

Figure 11.12 The task of modeling the sensory data is divided into two subtasks.

Table 11.2

Figure 11.13 Clustering at various phases. The lines are equiprobability contours, p = ? in (b) and p = ? elsewhere: (a) 1
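The alternating Gibbs sampling of Figure 11.11 can be sketched in a few lines. This is a minimal illustrative sketch, not the book's implementation: the RBM sizes (6 visible, 3 hidden units) and the randomly initialized weights are hypothetical placeholders, not trained parameters. Because the RBM has no visible-visible or hidden-hidden connections (Figure 11.8), each layer's conditional distribution factorizes given the other layer, which is what makes each half-step a simple independent Bernoulli draw.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy RBM: 6 visible units, 3 hidden units.
# Weights are random placeholders for illustration, not trained values.
W = rng.normal(0.0, 0.1, size=(6, 3))  # symmetric visible-hidden weights
b = np.zeros(6)                        # visible biases
c = np.zeros(3)                        # hidden biases

def sigmoid(x):
    # Sigmoid-shaped activation probability, as in Figure 11.6.
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    # One full step of alternating Gibbs sampling (Figure 11.11):
    # sample the hidden layer given the visible layer, then the
    # visible layer given the freshly sampled hidden layer.
    h = (rng.random(3) < sigmoid(v @ W + c)).astype(float)
    v_new = (rng.random(6) < sigmoid(h @ W.T + b)).astype(float)
    return v_new, h

v = (rng.random(6) < 0.5).astype(float)  # random initial visible vector
for _ in range(100):                     # iterate toward the stationary distribution
    v, h = gibbs_step(v)

print(v, h)
```

After sufficiently many steps the pair (v, h) is approximately a sample from the stationary distribution defined by the current parameters, which is exactly the role this chain plays when estimating the model's expectations during learning.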