Assuming the current layer has 25 neurons and the previous layer has 5, the forward pass of a fully connected layer can be implemented in MATLAB as follows:
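The original main.m listing is not reproduced here; the sketch below is a reconstruction based on the shapes described in the text, so the variable names and the use of randn for initialization are assumptions.

% Layer dimensions (assumed variable names)
n_current = 25;     % neurons in the current layer
n_previous = 5;     % neurons in the previous layer
batch_size = 2;     % batch size used in this example

% Random initialization of parameters and input
W = randn(n_current, n_previous);    % weights, shape (25, 5)
b = randn(n_current, 1);             % biases, shape (25, 1)
x = randn(n_previous, batch_size);   % input, shape (5, batch_size)

% Forward pass: matrix multiplication plus bias
% (implicit expansion broadcasts b across columns; on MATLAB releases
% before R2016b, use bsxfun(@plus, W * x, b) instead)
y = W * x + b;                       % output, shape (25, batch_size)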
In this implementation, the weights are randomly initialized with shape (25, 5), the biases with shape (25, 1), and the input with shape (5, batch_size).
The forward pass calculation is performed using matrix multiplication of the weights with the input, followed by addition of the biases. The output will have a shape of (25, batch_size), where batch_size is set to 2 in this example.
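As a quick sanity check on the sketch above, size(y) returns [25 2] when batch_size is 2, with each column holding the layer's output for one sample in the batch.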