All convolutions in the dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the feature maps remain unchanged, so convolutions in a dense block all have stride 1. Pooling layers are inserted between dense blocks to reduce the spatial dimensions.
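A minimal sketch of why stride-1 convolutions matter here, using NumPy arrays as stand-ins for feature maps (the shapes and channel counts are illustrative assumptions, not taken from any specific network):

```python
import numpy as np

# Channel-wise concatenation in a dense block: a stride-1, padded 3x3
# convolution preserves height and width, so the block input (64 channels
# here, an assumed value) and the layer's output (32 channels) can be
# stacked along the channel axis.
x = np.zeros((1, 64, 8, 8))   # input: (batch, channels, height, width)
y = np.zeros((1, 32, 8, 8))   # stride-1 conv output: same 8x8 spatial size
dense_out = np.concatenate([x, y], axis=1)
print(dense_out.shape)        # channels add up, spatial dims unchanged

# A stride-2 convolution would halve the spatial size (8 -> 4),
# making channel-wise concatenation impossible:
z = np.zeros((1, 32, 4, 4))
try:
    np.concatenate([x, z], axis=1)
except ValueError:
    print("mismatched spatial dims: cannot concatenate")
```

This is why downsampling is deferred to the pooling layers between dense blocks, where no concatenation with earlier feature maps takes place.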