Rectified Linear Unit (ReLU): an additional non-linear operation is introduced after each convolution operation. Its purpose is to add non-linearity to the CNN and help the model learn the data better. The output function of ReLU is as follows: f(x) = max(0, x).
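As a minimal sketch of the activation described above (using NumPy; the function and variable names here are illustrative, not from the original):

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: f(x) = max(0, x)
    # Negative entries are clamped to zero; non-negative entries pass through.
    return np.maximum(0, x)

# Example feature-map values after a convolution
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))
```

Applying ReLU after each convolution zeroes out negative activations, which is what introduces the non-linearity into the network.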