Here's an example of how to create a ReLU layer from scratch in MATLAB:
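The original `main.m` isn't preserved here, so the following is a minimal sketch of what such a layer could look like, built on the `nnet.layer.Layer` custom-layer template from the Deep Learning Toolbox. The class name `customReluLayer` and the layer description are assumptions; the `predict`/`backward` method signatures follow the documented template:

```matlab
% customReluLayer.m -- a hand-rolled ReLU layer based on the
% nnet.layer.Layer template (class name is an assumption).
classdef customReluLayer < nnet.layer.Layer
    methods
        function layer = customReluLayer(name)
            % Set the layer name and a human-readable description.
            layer.Name = name;
            layer.Description = "ReLU implemented from scratch";
        end

        function Z = predict(layer, X)
            % Forward pass: elementwise max(0, x).
            Z = max(X, 0);
        end

        function dLdX = backward(layer, X, Z, dLdZ, memory)
            % Backward pass: the gradient passes through only
            % where the input was positive.
            dLdX = dLdZ .* (X > 0);
        end
    end
end
```

Defining `backward` by hand like this is optional in recent toolbox versions (automatic differentiation can derive it from `predict`), but writing it out makes the from-scratch behavior explicit.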
You can use this ReLU layer by creating an instance of it and adding it to your network:
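This snippet is also missing from the original, so here is a plausible sketch of a layer array using the custom layer. The surrounding layers and their sizes (`imageInputLayer`, `convolution2dLayer`, `fullyConnectedLayer`) are illustrative assumptions, as is the class name `customReluLayer`:

```matlab
% Assumed example network: the custom ReLU sits between the
% convolutional layer and the fully connected layer.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    customReluLayer('relu1')   % instance of the custom layer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
```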
In this example, the ReLU layer is inserted after the convolutional layer and before the fully connected layer.