- The study analyzes the layerwise effective dimension of finite-width, fully connected ReLU networks.
- Closed-form expressions are derived for the expected rank of the hidden activation matrices for a fixed batch of inputs.
- The main result indicates that the rank deficit decays geometrically with a ratio of 0.3634.
- The oscillatory rank behavior observed is a finite-width phenomenon in random ReLU networks, shedding light on deep network expressivity.
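
As an illustrative sketch only (not the paper's code or its closed-form expressions), the following NumPy snippet empirically tracks the rank of the hidden activation matrices across layers of a single random ReLU network at finite width. The width, depth, batch size, input dimension, and He-style initialization are assumptions chosen for illustration.

```python
# Sketch: track layerwise ranks of hidden activation matrices in a random
# finite-width ReLU network. All hyperparameters below are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

width = 64    # hidden width (assumed)
depth = 20    # number of hidden layers (assumed)
batch = 128   # size of the fixed input batch (assumed)
d_in = 64     # input dimension (assumed)

X = rng.standard_normal((batch, d_in))

ranks = []
H = X
for layer in range(depth):
    # He-style scaling keeps activation magnitudes roughly stable with depth.
    W = rng.standard_normal((H.shape[1], width)) * np.sqrt(2.0 / H.shape[1])
    H = np.maximum(H @ W, 0.0)  # ReLU applied to the pre-activations
    ranks.append(int(np.linalg.matrix_rank(H)))

print("layerwise ranks:", ranks)
# One natural notion of rank deficit for this sketch: shortfall from full rank.
print("deficit from min(batch, width):", [min(batch, width) - r for r in ranks])
```

Averaging the recorded ranks over many random weight draws would give an empirical estimate of the expected layerwise rank; how that estimate relates to the geometric decay of the rank deficit reported above depends on the paper's exact definitions, which are not reproduced here.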