Depth estimation and scene segmentation are crucial tasks in intelligent transportation systems, and joint modeling of these tasks can reduce storage and training requirements.
This work introduces an adaptive multi-task distillation method that enhances unified modeling by dynamically adjusting knowledge transfer from multiple teachers according to the student's learning ability.
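One plausible realization of such adaptivity, sketched below purely for illustration (the weighting rule and all names are assumptions, not the paper's stated method), is to weight per-teacher distillation losses by how far the student currently is from each teacher, so that the strength of transfer tracks the student's learning state.

```python
# Illustrative sketch only: adaptive weighting of several teachers' distillation
# losses; the method in this work may use a different weighting rule.
import torch
import torch.nn.functional as F

def adaptive_multi_teacher_loss(student_logits, teacher_logits_list, temperature=4.0):
    """Weighted sum of KL distillation losses over several teachers.

    Teachers the student currently matches poorly receive larger weights,
    a simple proxy for adapting transfer to the student's learning ability.
    """
    per_teacher = []
    for teacher_logits in teacher_logits_list:
        kl = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2
        per_teacher.append(kl)
    losses = torch.stack(per_teacher)                # shape: [num_teachers]
    weights = torch.softmax(losses.detach(), dim=0)  # harder teachers weigh more
    return (weights * losses).sum()
```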
To prevent knowledge forgetting during distillation from multiple teachers, a knowledge trajectory is proposed that retains the essential information the model has already learned; a trajectory-based distillation loss built on this trajectory guides the student model.
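As a rough illustration, and assuming the knowledge trajectory can be approximated by an exponential moving average of the student's past predictions (an assumption made here, not the paper's stated formulation), a trajectory-based loss could be sketched as follows; the class name, momentum value, and KL-based penalty are all hypothetical.

```python
# Illustrative sketch only: an EMA-style "knowledge trajectory" that penalises
# drift of the student's current predictions from accumulated past knowledge.
import torch
import torch.nn.functional as F

class KnowledgeTrajectory:
    """Tracks an EMA of the student's mean class distribution and penalises drift from it."""

    def __init__(self, momentum: float = 0.99):
        self.momentum = momentum
        self.trajectory = None  # EMA of the mean soft prediction, shape [num_classes]

    @staticmethod
    def _mean_probs(logits: torch.Tensor) -> torch.Tensor:
        # Average softmax scores over batch (and spatial dims for dense outputs).
        probs = F.softmax(logits, dim=1)
        return probs.transpose(0, 1).reshape(probs.size(1), -1).mean(dim=1)

    @torch.no_grad()
    def update(self, student_logits: torch.Tensor) -> None:
        mean_probs = self._mean_probs(student_logits)
        if self.trajectory is None:
            self.trajectory = mean_probs.clone()
        else:
            self.trajectory.mul_(self.momentum).add_(mean_probs, alpha=1 - self.momentum)

    def loss(self, student_logits: torch.Tensor) -> torch.Tensor:
        # Trajectory-based term: keep current predictions close to accumulated knowledge.
        if self.trajectory is None:
            return student_logits.new_zeros(())
        current = self._mean_probs(student_logits).clamp_min(1e-8)
        return F.kl_div(current.log(), self.trajectory, reduction="sum")
```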
Evaluations on the Cityscapes and NYU-v2 benchmarks show that the proposed method outperforms existing solutions. The code is provided in the supplementary materials.