Researchers propose a two-stage hardware-aware neural architecture search (HW-NAS) framework for finding architectures that perform well on target devices.
The first stage involves training an architecture controller on synthetic devices.
The second stage deploys the learned controller on the target device without relying on pre-collected device information.
The framework enables the controller to design architectures for the target device through high-fidelity latency measurements and in-context adaptation.
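To make the two-stage idea concrete, here is a minimal sketch of the search loop, with everything hypothetical: the search space, the analytic latency model standing in for real measurements, and the accuracy proxy are all invented for illustration, not taken from the paper.

```python
import random

random.seed(0)

# Hypothetical search space: (number of layers, channel width).
SEARCH_SPACE = [(layers, width) for layers in (2, 4, 8) for width in (16, 32, 64)]

def synthetic_device():
    """Stage 1: sample a random synthetic device (per-layer and per-channel costs)."""
    return {"layer_cost": random.uniform(0.5, 2.0),
            "width_cost": random.uniform(0.01, 0.05)}

def latency(arch, device):
    """Toy analytic latency model, a stand-in for a real on-device measurement."""
    layers, width = arch
    return layers * device["layer_cost"] + layers * width * device["width_cost"]

def accuracy_proxy(arch):
    """Toy accuracy proxy: larger models score higher with diminishing returns."""
    layers, width = arch
    return 1.0 - 1.0 / (layers * width) ** 0.5

def controller_search(device, budget):
    """Stage-2-style loop: query latencies on the (synthetic or target) device
    and pick the best-scoring architecture that fits the latency budget."""
    feasible = [a for a in SEARCH_SPACE if latency(a, device) <= budget]
    if not feasible:
        # Nothing fits: fall back to the fastest architecture.
        return min(SEARCH_SPACE, key=lambda a: latency(a, device))
    return max(feasible, key=accuracy_proxy)

# Stage 1 (sketched): a real controller would be trained across many synthetic
# devices; here we just exercise the same loop on a few sampled devices.
for _ in range(3):
    dev = synthetic_device()
    arch = controller_search(dev, budget=10.0)
    print(arch, round(latency(arch, dev), 2))
```

In the actual framework the controller is a learned model adapted in context to the target device; this sketch replaces it with a direct budget-constrained search to show where the latency queries fit into the loop.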