Here is how to implement a fully connected layer in MATLAB using matrix multiplication:
The `FullyConnectedLayer` class has a constructor that takes the layer's input size and output size as arguments. Upon construction, the layer's weights are randomly initialized using Xavier initialization, and the bias is set to zero.
Forward propagation multiplies the input by the layer's weight matrix and adds the bias term.
Backward propagation updates the weights and bias using gradient descent, and returns the gradient of the loss with respect to the input so it can be propagated to the previous layer.
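The original `main.m` is not reproduced above; the following is a minimal sketch of a class matching that description. The class name `FullyConnectedLayer` comes from the text, but the specific property names, the `forward`/`backward` method signatures, and the `learningRate` parameter are assumptions for illustration.

```matlab
% Hypothetical reconstruction of the described layer, not the original main.m.
classdef FullyConnectedLayer < handle
    properties
        W      % weight matrix, outputSize x inputSize
        b      % bias vector, outputSize x 1
        input  % input cached during the forward pass, for use in backward
    end

    methods
        function obj = FullyConnectedLayer(inputSize, outputSize)
            % Xavier initialization: scale random weights by sqrt(1/inputSize)
            obj.W = randn(outputSize, inputSize) * sqrt(1 / inputSize);
            obj.b = zeros(outputSize, 1);
        end

        function out = forward(obj, x)
            % x: inputSize x batchSize; bias broadcasts across columns (R2016b+)
            obj.input = x;
            out = obj.W * x + obj.b;
        end

        function dX = backward(obj, dOut, learningRate)
            % dOut: outputSize x batchSize, gradient of the loss w.r.t. the output
            dW = dOut * obj.input';   % gradient w.r.t. weights
            db = sum(dOut, 2);        % gradient w.r.t. bias, summed over the batch
            dX = obj.W' * dOut;       % gradient w.r.t. input (computed before the update)
            obj.W = obj.W - learningRate * dW;
            obj.b = obj.b - learningRate * db;
        end
    end
end
```

A quick usage sketch: `layer = FullyConnectedLayer(4, 3); y = layer.forward(randn(4, 1));` produces a 3x1 output, and `layer.backward(dLdY, 0.01)` applies one gradient-descent step while returning the 4x1 input gradient. Note that `dX` is computed with the pre-update weights, which is the correct ordering for backpropagation.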