Researchers investigate deep morphological neural networks (DMNNs), whose layers replace the weighted sums of linear layers with morphological (max-plus and min-plus) operations, and emphasize the importance of activations between layers.
They introduce new DMNN architectures whose parameters are subject to different constraints, demonstrating that these networks can be trained successfully and can be pruned more heavily than linear networks.
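For intuition, a morphological (max-plus, or "dilation") layer replaces the weighted sum of a linear layer with a maximum over inputs shifted by learnable additive weights. The following minimal PyTorch sketch is an illustration, not the authors' code; the class name, the random initialization, and the absence of a bias term are assumptions:

```python
import torch
import torch.nn as nn

class DilationLayer(nn.Module):
    """Max-plus (dilation) layer: each output unit is the maximum over
    inputs plus learnable additive weights, in place of a weighted sum."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # One additive weight per (output, input) pair, as in a linear layer.
        self.weight = nn.Parameter(torch.randn(out_features, in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features) -> (batch, out_features).
        # Broadcast to (batch, out, in), add weights, take max over inputs.
        return (x.unsqueeze(1) + self.weight).amax(dim=-1)
```

Although the max operation is itself nonlinear, it is piecewise linear, which is consistent with the summary's point that activations between layers still matter.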
This study is the first successful attempt to train DMNNs under such constraints, although the resulting networks' generalization capabilities remain limited.
Additionally, the authors propose a hybrid architecture that combines linear and morphological layers, on which gradient descent with large batches is shown to converge faster.
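A sketch of what such a hybrid stack could look like, reusing the dilation layer above; the layer ordering, widths, and choice of ReLU are illustrative assumptions, not details taken from the study:

```python
class HybridNet(nn.Module):
    """Hypothetical hybrid network alternating linear and morphological
    layers; depth and dimensions here are assumptions."""

    def __init__(self, in_features: int = 784, hidden: int = 128,
                 num_classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),                       # activation between layers
            DilationLayer(hidden, hidden),   # morphological (max-plus) layer
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Usage sketch: large-batch gradient descent (batch size is an assumption).
model = HybridNet()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
x = torch.randn(1024, 784)               # one large batch of inputs
loss = nn.functional.cross_entropy(model(x), torch.randint(0, 10, (1024,)))
loss.backward()
opt.step()
```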