techminis

A naukri.com initiative
Image Credit: Arxiv

Distilling Knowledge from Heterogeneous Architectures for Semantic Segmentation

  • Current knowledge distillation methods for semantic segmentation focus on guiding the student to imitate the teacher's knowledge within homogeneous architectures.
  • A generic knowledge distillation method for semantic segmentation from a heterogeneous perspective, called HeteroAKD, is proposed.
  • HeteroAKD eliminates the influence of architecture-specific information by projecting intermediate features of the teacher and student into an aligned logits space.
  • HeteroAKD outperforms state-of-the-art KD methods in facilitating distillation between heterogeneous architectures.
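The core idea in the bullets above — projecting teacher and student intermediate features into a shared, aligned logits space and distilling there — can be sketched as follows. This is a minimal illustration of that general mechanism, not the paper's exact formulation: the projection matrices `W_t`, `W_s` and the softened-KL loss are assumptions standing in for HeteroAKD's learned alignment.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def heterogeneous_kd_loss(f_t, f_s, W_t, W_s, T=2.0):
    """Hypothetical sketch: map architecture-specific intermediate
    features (teacher f_t, student f_s, with different channel widths)
    into one K-class logits space, then distill with a temperature-
    softened KL divergence. Projecting into a common logits space is
    what removes the architecture-specific mismatch."""
    z_t = f_t @ W_t            # (P, C_t) @ (C_t, K) -> (P, K) per-pixel logits
    z_s = f_s @ W_s            # (P, C_s) @ (C_s, K) -> (P, K)
    p_t = softmax(z_t / T)     # softened teacher distribution
    p_s = softmax(z_s / T)     # softened student distribution
    # KL(teacher || student), averaged over pixels, scaled by T^2
    kl = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1)
    return T * T * kl.mean()

# Usage: a 16-pixel crop, teacher with 8 channels, student with 6, K=4 classes.
rng = np.random.default_rng(0)
f_t, W_t = rng.normal(size=(16, 8)), rng.normal(size=(8, 4))
f_s, W_s = rng.normal(size=(16, 6)), rng.normal(size=(6, 4))
loss = heterogeneous_kd_loss(f_t, f_s, W_t, W_s)
```

Note that the teacher and student feature dimensions differ (8 vs. 6); the two projections make the distillation loss well-defined anyway, which is the point of aligning heterogeneous architectures in logits space.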

Read Full Article
