AxBERT is an interpretable deep learning model proposed for Chinese spelling correction. It aligns BERT with an associative knowledge network (AKN) constructed from co-occurrence relations among Chinese characters. A translator matrix between BERT and the AKN is introduced to align and regulate the attention component of BERT. Experimental results on the SIGHAN datasets show that AxBERT achieves strong correction performance while remaining interpretable.
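A minimal PyTorch sketch of the idea described above, under stated assumptions: the AKN is taken to be a row-normalized character co-occurrence matrix, and the "translator matrix" is modeled as a learnable linear map that projects BERT attention heads into the AKN space so the two can be compared. Names such as `build_akn` and `AttentionRegulator` are illustrative and not taken from the paper.

```python
from itertools import combinations

import torch
import torch.nn as nn


def build_akn(sentences, vocab):
    """Build a row-normalized co-occurrence matrix over Chinese characters."""
    idx = {ch: i for i, ch in enumerate(vocab)}
    counts = torch.zeros(len(vocab), len(vocab))
    for sent in sentences:
        chars = [c for c in sent if c in idx]
        for a, b in combinations(set(chars), 2):
            counts[idx[a], idx[b]] += 1
            counts[idx[b], idx[a]] += 1
    # Normalize each row so entries behave like co-occurrence probabilities.
    return counts / counts.sum(dim=-1, keepdim=True).clamp(min=1.0)


class AttentionRegulator(nn.Module):
    """Translate multi-head attention into the AKN space and score disagreement."""

    def __init__(self, num_heads):
        super().__init__()
        # Learnable translator: mixes the attention heads into a single map
        # that is comparable with the AKN co-occurrence statistics.
        self.translator = nn.Linear(num_heads, 1, bias=False)

    def forward(self, attention, akn_slice):
        # attention: (batch, heads, seq, seq); akn_slice: (batch, seq, seq)
        translated = self.translator(attention.permute(0, 2, 3, 1)).squeeze(-1)
        # Regulation loss: encourage the translated attention distribution
        # to agree with the AKN statistics for the same character positions.
        return nn.functional.mse_loss(translated.softmax(dim=-1), akn_slice)
```

In this sketch the regulation term would be added to the standard correction loss during fine-tuning; how the actual model combines the two objectives is specified in the paper, not here.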