Microsoft researchers have developed BitNet b1.58 2B4T, a 2-billion-parameter model they describe as the largest-scale 1-bit AI model to date.
The model quantizes its weights to just three values (-1, 0, and 1), which sharply reduces memory and compute requirements compared with 16- or 32-bit weights.
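As a rough illustration of this kind of ternary quantization, the sketch below scales a weight matrix by its mean absolute value and rounds each entry into {-1, 0, +1}. This is only an assumption-laden example (the function name and per-matrix scaling are illustrative), not Microsoft's actual implementation.

```python
import numpy as np

def absmean_ternary_quantize(weights: np.ndarray, eps: float = 1e-8):
    """Illustrative sketch: quantize a float weight matrix to {-1, 0, +1}
    using a per-matrix absmean scale. Not the official BitNet code."""
    # Scale by the mean absolute value of the matrix.
    scale = np.mean(np.abs(weights)) + eps
    # Round to the nearest integer and clip into the ternary set.
    q = np.clip(np.round(weights / scale), -1, 1)
    return q.astype(np.int8), scale

# Example: a full-precision matrix collapses to ternary entries, so each
# weight needs about 1.58 bits (log2 of 3) instead of 16 or 32.
w = np.random.randn(4, 4).astype(np.float32)
q, s = absmean_ternary_quantize(w)
print(q)  # entries are only -1, 0, or 1
print(s)  # the scale restores approximate magnitude: w is roughly q * s
```

Because the quantized weights are ternary, matrix multiplications reduce largely to additions and subtractions, which is where the memory and compute savings come from.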
Trained on a dataset of 4 trillion tokens, the model outperforms full-precision models of comparable size on several benchmarks.
BitNet b1.58 2B4T runs faster and uses less memory than comparable models, but those gains require Microsoft's custom bitnet.cpp inference framework, which currently supports only a limited range of hardware and does not run on GPUs.