SEMFED is a novel semantic-aware, resource-efficient federated learning (FL) framework designed for heterogeneous NLP tasks.
SEMFED incorporates a semantic-aware client selection mechanism, adaptive NLP-specific model architectures, and a communication-efficient semantic feature compression technique.
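To make the communication-efficiency idea concrete, the sketch below shows a generic top-k feature sparsification scheme: only the largest-magnitude feature values are transmitted as (index, value) pairs. This is an illustrative assumption about how semantic feature compression could work, not SEMFED's actual technique; the function names and the choice of top-k are hypothetical.

```python
# Illustrative sketch of communication-efficient feature compression.
# Generic top-k sparsification; NOT the paper's exact SEMFED scheme.

def compress_features(features, k):
    """Keep the k largest-magnitude values as sorted (index, value) pairs."""
    ranked = sorted(range(len(features)),
                    key=lambda i: abs(features[i]), reverse=True)
    kept = sorted(ranked[:k])
    return [(i, features[i]) for i in kept]

def decompress_features(pairs, dim):
    """Rebuild a dense vector, zero-filling the dropped entries."""
    dense = [0.0] * dim
    for i, v in pairs:
        dense[i] = v
    return dense

feats = [0.1, -2.4, 0.0, 3.3, -0.2, 1.1]
packed = compress_features(feats, k=2)          # 2 pairs instead of 6 floats
restored = decompress_features(packed, dim=len(feats))
```

A client would send `packed` instead of the full feature vector, trading reconstruction error for a smaller payload.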
Experimental results show that SEMFED reduces communication costs by 80.5% while maintaining model accuracy above 98% on a range of NLP classification tasks, outperforming existing FL approaches.
SEMFED effectively handles diverse client environments with varying computational resources, network reliability, and semantic data distributions, making it well suited to real-world federated NLP deployments.