create a relu layer from scratch in matlab

Here's an example of how to create a ReLU layer from scratch in MATLAB by subclassing `nnet.layer.Layer`. The class is named `myReluLayer` so it doesn't shadow the built-in `reluLayer`, and it must be saved in a file with the same name as the class:

myReluLayer.m
classdef myReluLayer < nnet.layer.Layer
    
    methods
        function layer = myReluLayer(name)
            % Set layer name
            layer.Name = name;

            % Set layer description
            layer.Description = "ReLU Layer";

            % Set layer type
            layer.Type = "ReLU";
        end
        
        function Z = predict(layer, X)
            % Forward propagation: element-wise max with zero
            Z = max(X, 0);
        end
        
        function dLdX = backward(layer, X, ~, dLdZ, ~)
            % Backward propagation: pass the incoming gradient
            % through only where the input was positive
            dLdX = dLdZ .* (X > 0);
        end
    end
end
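Before wiring the layer into a network, you can sanity-check its `predict` and `backward` methods with the Deep Learning Toolbox function `checkLayer`. A minimal sketch, assuming the class above is saved on the path (the constructor name here matches the listing; adjust it if you saved the class under a different name):

```matlab
% Validate the custom layer's forward and backward passes
% (requires Deep Learning Toolbox)
layer = myReluLayer('relu_test');

% [28 28 1] is the per-observation input size; observations
% are stacked along the 4th dimension for image data
checkLayer(layer, [28 28 1], 'ObservationDimension', 4);
```

`checkLayer` runs a suite of consistency tests, including a numerical-gradient check against the `backward` implementation.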

You can use this ReLU layer by creating an instance of it and adding it to a layer array:

main.m
relu = myReluLayer('my_relu');
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(5, 10)
    relu
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer
];

In this example, the ReLU layer is inserted after the convolutional layer and before the fully connected layer.
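As a sketch of end-to-end use, you can train this layer array on the sample digit dataset bundled with Deep Learning Toolbox (the variable name `layers` refers to the layer array defined above; the training options here are illustrative, not tuned):

```matlab
% Load the 28x28 grayscale digit images shipped with
% Deep Learning Toolbox
[XTrain, YTrain] = digitTrain4DArrayData;

% Illustrative training options; adjust epochs/solver as needed
options = trainingOptions('sgdm', ...
    'MaxEpochs', 5, ...
    'Verbose', false);

% Train a network containing the custom ReLU layer
net = trainNetwork(XTrain, YTrain, layers, options);
```

Because the input images are 28x28x1, they match the `imageInputLayer([28 28 1])` at the front of the layer array.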
