All convolutions within a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the data remain unchanged, so the convolutions inside a dense block all use a stride of 1. Pooling layers are inserted between dense blocks to reduce the spatial dimensions of the representation.
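A minimal sketch of these two pieces, assuming PyTorch. The names `conv_block`, `DenseBlock`, `transition_block`, `num_convs`, and `growth_rate` are illustrative, not taken from the text or any particular library.

```python
import torch
from torch import nn


def conv_block(in_channels: int, out_channels: int) -> nn.Sequential:
    # BN -> ReLU -> 3x3 convolution with stride 1 and padding 1, so height and
    # width are unchanged and the output can be concatenated channel-wise
    # with the input.
    return nn.Sequential(
        nn.BatchNorm2d(in_channels),
        nn.ReLU(),
        nn.Conv2d(in_channels, out_channels, kernel_size=3, stride=1, padding=1),
    )


class DenseBlock(nn.Module):
    def __init__(self, num_convs: int, in_channels: int, growth_rate: int):
        super().__init__()
        # Each convolution sees the original input plus all earlier outputs.
        self.blocks = nn.ModuleList(
            conv_block(in_channels + i * growth_rate, growth_rate)
            for i in range(num_convs)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            y = block(x)
            # Concatenate input and output along the channel dimension.
            x = torch.cat((x, y), dim=1)
        return x


def transition_block(in_channels: int, out_channels: int) -> nn.Sequential:
    # Placed between dense blocks: a 1x1 convolution shrinks the channel count
    # and average pooling halves the height and width.
    return nn.Sequential(
        nn.BatchNorm2d(in_channels),
        nn.ReLU(),
        nn.Conv2d(in_channels, out_channels, kernel_size=1),
        nn.AvgPool2d(kernel_size=2, stride=2),
    )


if __name__ == "__main__":
    x = torch.randn(4, 3, 8, 8)
    y = DenseBlock(num_convs=2, in_channels=3, growth_rate=10)(x)
    print(y.shape)  # torch.Size([4, 23, 8, 8]): 3 + 2 * 10 channels, same H and W
    z = transition_block(23, 10)(y)
    print(z.shape)  # torch.Size([4, 10, 4, 4]): spatial dimensions halved
```

The usage example at the bottom shows why stride-1 convolutions are required inside the block (the 8x8 spatial size is preserved so concatenation is valid) and how the pooling in the transition block is what actually downsamples between blocks.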