The output of the convolutional layer is generally passed through the ReLU activation function to introduce non-linearity into the model. ReLU takes the feature map and replaces all negative values with zero.

Understanding the complexity of the model
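The ReLU behaviour described above can be sketched with NumPy (a minimal illustration; the feature-map values here are made up for the example):

```python
import numpy as np

def relu(x):
    # ReLU keeps positive values unchanged and replaces negatives with zero.
    return np.maximum(x, 0)

# Hypothetical 2x2 feature map produced by a convolutional layer.
feature_map = np.array([[-1.5, 2.0],
                        [0.0, -3.0]])

print(relu(feature_map))
# Negative entries (-1.5 and -3.0) become 0; 2.0 and 0.0 pass through.
```

In a real network this elementwise operation is applied to every channel of the convolutional output before the next layer.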