Generative Binary Memory (GBM) is a novel pseudo-replay approach for Class-Incremental Learning (CIL).
GBM generates synthetic binary pseudo-exemplars using Bernoulli Mixture Models (BMMs).
The approach is applicable to any conventional Deep Neural Network (DNN) and supports Binary Neural Networks (BNNs) for embedded systems.
Experimental results show that GBM achieves higher accuracy than state-of-the-art methods on the CIFAR100 and TinyImageNet datasets, and outperforms existing CIL methods for BNNs while requiring less memory.
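To make the pseudo-replay mechanism concrete, the sketch below shows one plausible way to fit a Bernoulli Mixture Model on binary feature vectors with EM and then sample binary pseudo-exemplars from it. This is a minimal illustration, not the paper's implementation: the class name `BernoulliMixture`, the per-class usage, and all hyperparameters (number of components, iterations) are assumptions for demonstration.

```python
import numpy as np

class BernoulliMixture:
    """Minimal EM-fitted Bernoulli Mixture Model over binary vectors (illustrative sketch)."""

    def __init__(self, n_components, n_iter=50, eps=1e-6, seed=0):
        self.k = n_components
        self.n_iter = n_iter
        self.eps = eps
        self.rng = np.random.default_rng(seed)

    def fit(self, X):
        n, d = X.shape
        self.pi = np.full(self.k, 1.0 / self.k)              # mixing weights
        self.mu = self.rng.uniform(0.25, 0.75, (self.k, d))  # per-component Bernoulli means
        for _ in range(self.n_iter):
            # E-step: responsibilities from per-component Bernoulli log-likelihoods
            log_p = (X @ np.log(self.mu + self.eps).T
                     + (1 - X) @ np.log(1 - self.mu + self.eps).T
                     + np.log(self.pi + self.eps))
            log_p -= log_p.max(axis=1, keepdims=True)
            r = np.exp(log_p)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: update mixing weights and Bernoulli means
            nk = r.sum(axis=0) + self.eps
            self.pi = nk / n
            self.mu = (r.T @ X) / nk[:, None]
        return self

    def sample(self, n_samples):
        # Draw a component per sample, then sample each bit from its Bernoulli mean
        comps = self.rng.choice(self.k, size=n_samples, p=self.pi)
        bits = self.rng.random((n_samples, self.mu.shape[1])) < self.mu[comps]
        return bits.astype(np.uint8)


# Hypothetical usage: fit one BMM per past class on its binary latent features,
# then sample binary pseudo-exemplars for replay when learning new classes.
binary_feats = (np.random.rand(200, 64) > 0.5).astype(np.uint8)  # stand-in for real features
bmm = BernoulliMixture(n_components=4).fit(binary_feats)
pseudo_exemplars = bmm.sample(32)  # synthetic binary pseudo-exemplars
```

In this reading, storing only the BMM parameters (mixing weights and Bernoulli means) per class, rather than raw exemplars, is what yields the reduced memory footprint reported above; the exact feature space and number of components used by GBM are not specified here.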