The article discusses algorithm protection measures in the context of federated learning, focusing on three critical layers. After analyzing the risks, it identifies two categories of protection measures and stresses that both the qualitative and quantitative characteristics of each mechanism should be weighed when selecting suitable solutions.

Confidential Containers (CoCo) technology aims to protect algorithm code and data from the hosting company by building on various hardware technologies. TEE-based approaches such as CoCo nevertheless exhibit security gaps that allow skilled administrators to bypass the protection mechanisms, so security concerns remain. Distroless container images reduce the attack surface but do not adequately protect the algorithm code itself, and compiled languages such as C, C++, and Rust offer better protection against reverse engineering than interpreted languages such as Python. Homomorphic Encryption (HE) is highlighted as a means of securing data in federated learning, but it does not protect the algorithm. The article therefore emphasizes hardware isolation as the key safeguard for algorithms and concludes that combining protection mechanisms such as compilation, obfuscation, and encryption creates layered barriers against intellectual property theft; the sketches below illustrate these building blocks.
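To make the compiled-versus-interpreted point concrete, the following minimal sketch assumes the proprietary routine has already been compiled from C, C++, or Rust into a hypothetical native library, libsecret_algo.so, so only machine code is shipped and Python is used purely for orchestration; the library name and the score signature are illustrative and not taken from the article.

```python
import ctypes
from pathlib import Path

# Hypothetical artifact: the proprietary algorithm compiled from C/C++/Rust.
# Only this machine code is deployed; no readable Python source is shipped.
LIB_PATH = Path("libsecret_algo.so")

if LIB_PATH.exists():
    lib = ctypes.CDLL(str(LIB_PATH))
    # Assumed C signature: double score(const double *values, size_t n);
    lib.score.restype = ctypes.c_double
    lib.score.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_size_t]

    values = (ctypes.c_double * 3)(1.0, 2.0, 3.0)
    print("score:", lib.score(values, len(values)))
else:
    print("compiled artifact not found; build libsecret_algo.so first")
```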
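The HE remark can likewise be sketched, here assuming the TenSEAL library as one possible CKKS implementation: each participant encrypts its model update, and the aggregator adds ciphertexts without ever seeing plaintext values. This protects the data in transit and during aggregation, not the algorithm itself.

```python
import tenseal as ts  # assumed HE library (pip install tenseal)

# CKKS context shared by the participants; parameters follow common examples.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()

# Two nodes encrypt their local model updates before sending them out.
update_a = ts.ckks_vector(context, [0.10, -0.20, 0.05])
update_b = ts.ckks_vector(context, [0.30, 0.10, -0.15])

# The aggregator sums ciphertexts without access to the plaintext updates.
aggregated = update_a + update_b
print(aggregated.decrypt())  # only a secret-key holder can read the result
```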
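Finally, a hedged sketch of combining mechanisms: the compiled (and possibly obfuscated) artifact is additionally encrypted at rest, and in a CoCo-style deployment the decryption key would only be released to an environment that has passed attestation. The flow below uses the cryptography package's Fernet primitive purely for illustration; the key-release step is an assumption, not a detail from the article.

```python
from cryptography.fernet import Fernet

# Key held by the algorithm owner (or a key broker tied to TEE attestation).
key = Fernet.generate_key()
fernet = Fernet(key)

# Placeholder for the compiled, obfuscated algorithm artifact.
artifact = b"...bytes of the compiled algorithm library..."

# Encrypt the artifact before it leaves the owner's infrastructure.
sealed = fernet.encrypt(artifact)

# Inside the attested execution environment the key is released and the
# artifact is restored in memory, never stored as plaintext on the host.
restored = fernet.decrypt(sealed)
assert restored == artifact
```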