PreNeT is a predictive framework designed to optimize the training time of deep learning models, particularly Transformer-based architectures. It integrates comprehensive computational metrics, including layer-specific parameters, arithmetic operations, and memory utilization, to accurately predict training duration across diverse hardware infrastructures, including novel accelerator architectures. Experimental results show that PreNeT achieves up to a 72% improvement in prediction accuracy compared to contemporary frameworks.
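To make the idea concrete, the sketch below illustrates the kind of inputs the abstract describes: per-layer parameter counts, arithmetic operations (FLOPs), and memory footprint, fed into a simple per-layer time model. This is an illustrative assumption, not PreNeT's actual implementation; the feature formulas, function names, and coefficients are hypothetical.

```python
def transformer_layer_features(d_model, d_ff, seq_len, batch, bytes_per_param=4):
    """Rough analytic features for one Transformer encoder layer.

    All formulas are standard back-of-the-envelope estimates, used here
    only to illustrate layer-specific metrics of the kind PreNeT consumes.
    """
    # Parameters: four attention projections (Q, K, V, output) plus
    # the two feed-forward matrices.
    params = 4 * d_model * d_model + 2 * d_model * d_ff
    tokens = batch * seq_len
    # FLOPs per forward pass: ~2 multiply-adds per weight per token,
    # plus the attention score and value matmuls, which scale with seq_len^2.
    flops = 2 * params * tokens + 4 * batch * seq_len * seq_len * d_model
    # Memory: weights plus a (very rough) activation footprint.
    memory = params * bytes_per_param + tokens * d_model * bytes_per_param
    return params, flops, memory

def predict_step_time(features, coeffs):
    """Linear model: time = c_p*params + c_f*flops + c_m*memory + bias.

    In a real predictor, `coeffs` would be fitted per accelerator from
    profiled training runs; the values used below are placeholders.
    """
    params, flops, memory = features
    c_p, c_f, c_m, bias = coeffs
    return c_p * params + c_f * flops + c_m * memory + bias

feats = transformer_layer_features(d_model=768, d_ff=3072, seq_len=512, batch=8)
# Hypothetical coefficients standing in for a fitted hardware profile.
t = predict_step_time(feats, coeffs=(0.0, 1e-13, 1e-10, 0.002))
print(f"predicted per-layer step time: {t:.4f} s")
```

Summing such per-layer predictions over all layers and training steps would yield a total training-time estimate for a given hardware target.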