The paper introduces FLIP, a framework for evaluating federated prompt learning algorithms. FLIP assesses the performance of eight state-of-the-art federated prompt learning methods across a range of evaluation scenarios. The findings demonstrate that prompt learning maintains strong generalization performance with minimal resource consumption. This work highlights the effectiveness of federated prompt learning under data scarcity and cross-domain distributional shift.
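The core idea shared by the federated prompt learning methods the paper evaluates can be sketched as follows: each client tunes only a small prompt tensor locally while the backbone stays frozen, and the server aggregates the clients' prompts (here with simple FedAvg-style averaging). This is a minimal illustrative sketch under assumed toy dimensions, not the paper's actual algorithms; `local_update` is a hypothetical stand-in for client-side prompt tuning.

```python
import numpy as np

PROMPT_LEN, EMBED_DIM = 4, 8  # toy prompt shape (assumption, for illustration)

def local_update(prompt, rng, lr=0.1):
    # Stand-in for client-side prompt tuning: one noisy gradient step.
    # Real methods would backpropagate through a frozen model instead.
    fake_grad = rng.normal(size=prompt.shape)
    return prompt - lr * fake_grad

def federated_round(global_prompt, num_clients, seed=0):
    rng = np.random.default_rng(seed)
    # Each client starts from the broadcast global prompt and tunes locally.
    client_prompts = [local_update(global_prompt.copy(), rng)
                      for _ in range(num_clients)]
    # Server aggregates by averaging the clients' prompt tensors (FedAvg).
    return np.mean(client_prompts, axis=0)

prompt = np.zeros((PROMPT_LEN, EMBED_DIM))
for round_id in range(3):
    prompt = federated_round(prompt, num_clients=5, seed=round_id)
print(prompt.shape)  # (4, 8)
```

Because only the prompt tensor (here 4x8 values) is communicated each round rather than full model weights, this design keeps per-round resource consumption low, which is consistent with the efficiency findings the paper reports.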