1d resnet in matlab

Here is an example of how to implement a 1D ResNet in MATLAB using the Deep Learning Toolbox. Because residual blocks contain skip connections, the network is assembled as a layer graph rather than a plain layer array:

main.m
% Load and prepare the data
[XTrain, YTrain] = prepareData();

% Define the main branch of the network. The residual blocks need
% skip connections, which are wired up with connectLayers below.
numFeatures = size(XTrain{1}, 1);   % features per time step
layers = [
    sequenceInputLayer(numFeatures, 'Name', 'InputLayer')
    convolution1dLayer(3, 64, 'Padding', 'same', 'Name', 'Conv1')
    batchNormalizationLayer('Name', 'BN1')
    reluLayer('Name', 'ReLU1')
    maxPooling1dLayer(2, 'Stride', 2, 'Name', 'MaxPool1')
    residualBlock(64, 'Block2')
    residualBlock(64, 'Block3')
    globalAveragePooling1dLayer('Name', 'GlobalAvgPool1')
    fullyConnectedLayer(10, 'Name', 'FC1')
    softmaxLayer('Name', 'Softmax')
    classificationLayer('Name', 'OutputLayer')
];

% Wire up the identity (skip) connections: each block's addition
% layer takes the block input as its second input
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph, 'MaxPool1', 'Block2Add/in2');
lgraph = connectLayers(lgraph, 'Block2ReLU2', 'Block3Add/in2');

% Specify training options
options = trainingOptions('adam', ...
    'InitialLearnRate', 0.001, ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 128, ...
    'Plots', 'training-progress');

% Train the network
net = trainNetwork(XTrain, YTrain, lgraph, options);

function block = residualBlock(numFilters, tag)
% Two conv-BN stages with a ReLU in between, an addition layer that
% merges the skip connection, and a final ReLU. Note that
% convolution1dLayer takes (filterSize, numFilters) in that order.
block = [
    convolution1dLayer(3, numFilters, 'Padding', 'same', 'Name', [tag, 'Conv1'])
    batchNormalizationLayer('Name', [tag, 'BN1'])
    reluLayer('Name', [tag, 'ReLU1'])
    convolution1dLayer(3, numFilters, 'Padding', 'same', 'Name', [tag, 'Conv2'])
    batchNormalizationLayer('Name', [tag, 'BN2'])
    additionLayer(2, 'Name', [tag, 'Add'])
    reluLayer('Name', [tag, 'ReLU2'])
];
end

function [XTrain, YTrain] = prepareData()
% Reshape the raw data into the format trainNetwork expects for
% sequence classification: a cell array with one
% numFeatures-by-sequenceLength matrix per observation, plus one
% categorical label per observation. gendata is assumed to be a
% numObservations-by-numFeatures-by-sequenceLength array.
load gendata.mat gendata
data = permute(gendata, [2 3 1]);        % numFeatures x seqLength x numObservations
numObs = size(data, 3);
XTrain = squeeze(num2cell(data, [1 2])); % one cell per observation
% Labels are assumed to cycle through the 10 classes
YTrain = categorical(repmat(1:10, 1, numObs/10))';
end

This example defines a 1D ResNet with two residual blocks. Each block consists of two convolutional layers with batch normalization, a ReLU activation after the first convolution, an addition layer that adds the block's input (the skip connection, wired up with connectLayers) to the output of the second convolution, and a final ReLU activation. The output of the residual blocks is passed through a global average pooling layer; the fully connected layer then produces the class scores, the softmax layer converts them to probabilities, and the classification layer computes the cross-entropy loss. The model is trained for 20 epochs with the Adam optimizer.
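Before calling trainNetwork, it can be worth sanity-checking that both inputs of each addition layer are actually connected. A minimal check, reusing the lgraph variable from the script above:

% Visualize the layer graph to confirm the skip connections
figure
plot(lgraph)

% Or run the interactive analyzer, which flags unconnected layer inputs
analyzeNetwork(lgraph)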

Note that the prepareData function loads and formats the input data in the format required by the trainNetwork function for sequence classification: XTrain is a cell array with one matrix of size numFeatures-by-sequenceLength per observation, and YTrain is a categorical vector with one label per observation, drawn from the numClasses output classes (in this case, 10).
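If gendata.mat is not at hand, the pipeline can be smoke-tested on random data in the same format. The sizes below (500 observations, 12 features, sequences of length 100) are made-up placeholders:

% Hypothetical stand-in for prepareData using random sequences
numObs = 500; numFeatures = 12; seqLength = 100;
XTrain = cell(numObs, 1);
for i = 1:numObs
    XTrain{i} = randn(numFeatures, seqLength);  % numFeatures-by-sequenceLength
end
YTrain = categorical(randi(10, numObs, 1));     % one of 10 class labels per sequence

% After training, classify a new sequence the same way
XNew = randn(numFeatures, seqLength);
YPred = classify(net, XNew);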
