Neural networks

Author: spraysss | Published 2020-01-28 18:57

Layers

  • input layer
  • output layer
  • hidden layer

a^{(j)}_i is the "activation" of unit i in layer j
\Theta^{(j)} is the matrix of weights controlling the function mapping from layer j to layer j+1

a^{(2)}_1=g(\Theta_{10}^{(1)}x_0+\Theta_{11}^{(1)}x_1+\Theta_{12}^{(1)}x_2+\Theta_{13}^{(1)}x_3)
a^{(2)}_2=g(\Theta_{20}^{(1)}x_0+\Theta_{21}^{(1)}x_1+\Theta_{22}^{(1)}x_2+\Theta_{23}^{(1)}x_3)
a^{(2)}_3=g(\Theta_{30}^{(1)}x_0+\Theta_{31}^{(1)}x_1+\Theta_{32}^{(1)}x_2+\Theta_{33}^{(1)}x_3)
h_\Theta(x)=a^{(3)}_1=g(\Theta_{10}^{(2)}a^{(2)}_0+\Theta_{11}^{(2)}a^{(2)}_1+\Theta_{12}^{(2)}a^{(2)}_2+\Theta_{13}^{(2)}a^{(2)}_3)
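
A minimal NumPy sketch of these forward-propagation steps, assuming the same shape as the formulas above (3 inputs, 3 hidden units, 1 output, sigmoid g); the weight values below are random placeholders:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, Theta1, Theta2):
    # Prepend the bias unit x_0 = 1 and compute the layer-2 activations a^{(2)}.
    a1 = np.concatenate(([1.0], x))           # shape (4,)
    a2 = sigmoid(Theta1 @ a1)                 # shape (3,)
    # Prepend the bias unit a^{(2)}_0 = 1 and compute the output h_Theta(x).
    a2 = np.concatenate(([1.0], a2))          # shape (4,)
    return sigmoid(Theta2 @ a2)               # shape (1,), i.e. a^{(3)}_1

# Random weights of the right shapes, just to make the sketch runnable:
rng = np.random.default_rng(0)
Theta1 = rng.normal(size=(3, 4))   # layer 1 (3 inputs + bias) -> layer 2 (3 units)
Theta2 = rng.normal(size=(1, 4))   # layer 2 (3 units + bias)  -> layer 3 (1 unit)
print(forward(np.array([0.5, -1.0, 2.0]), Theta1, Theta2))
```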

If a network has s_j units in layer j and s_{j+1} units in layer j+1, then \Theta^{(j)} will be of dimension s_{j+1}\times (s_j+1).
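
For example, the network above has s_1 = 3 input units and s_2 = 3 hidden units, so \Theta^{(1)} is 3\times 4 and \Theta^{(2)}, which maps to the single output unit, is 1\times 4; the "+1" column in each case multiplies the bias unit (x_0 or a^{(2)}_0).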

Implementing XNOR with a neural network
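
One way to realize XNOR is to put an AND unit and a (NOT x_1) AND (NOT x_2) unit in the hidden layer and OR them in the output layer. A sketch using the hand-picked weights commonly used for this construction (function and variable names are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hand-picked weights:
# hidden unit 1 computes x1 AND x2, hidden unit 2 computes (NOT x1) AND (NOT x2)
Theta1 = np.array([[-30.0,  20.0,  20.0],
                   [ 10.0, -20.0, -20.0]])
# output unit ORs the two hidden units, giving XNOR overall
Theta2 = np.array([[-10.0, 20.0, 20.0]])

def xnor(x1, x2):
    a1 = np.array([1.0, x1, x2])                        # input with bias
    a2 = np.concatenate(([1.0], sigmoid(Theta1 @ a1)))  # hidden layer with bias
    return sigmoid(Theta2 @ a2)[0]                      # output h_Theta(x)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, int(xnor(x1, x2) > 0.5))  # prints 1 exactly when x1 == x2
```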

Multi-class classification with neural networks
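
For multi-class classification the output layer has one unit per class: training labels are one-hot encoded, and the predicted class is the unit with the largest activation. A minimal sketch, assuming K classes and illustrative layer sizes and weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x, Theta1, Theta2):
    """Forward propagation for a network whose output layer has K units,
    one per class; the prediction is the index of the largest output."""
    a1 = np.concatenate(([1.0], x))
    a2 = np.concatenate(([1.0], sigmoid(Theta1 @ a1)))
    h = sigmoid(Theta2 @ a2)      # K output activations
    return int(np.argmax(h))      # class label 0..K-1

def one_hot(y, K):
    """One-hot encoding of a label, e.g. class 2 of 4 -> [0, 0, 1, 0]."""
    e = np.zeros(K)
    e[y] = 1.0
    return e

# Example shapes for 3 input features, 5 hidden units, K = 4 classes:
rng = np.random.default_rng(1)
Theta1 = rng.normal(size=(5, 4))   # (s_2, s_1 + 1)
Theta2 = rng.normal(size=(4, 6))   # (K,   s_2 + 1)
print(predict(np.array([0.1, 0.2, 0.3]), Theta1, Theta2), one_hot(2, 4))
```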

