A new paper introduces a unified framework for weakly supervised learning from N-tuples, aiming to reduce the annotation burden of fully supervised learning.
The framework is built on empirical risk minimization (ERM) and additionally incorporates pointwise unlabeled data to improve learning performance.
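To make the training objective concrete, below is a minimal sketch of an ERM-style objective that combines an N-tuple loss with a term on pointwise unlabeled data. The particular choices here, a logistic tuple-ranking loss and a class-prior matching term, along with the hypothetical names `tuple_loss` and `unlabeled_loss`, are illustrative assumptions rather than the paper's exact risk estimator.

```python
# Minimal ERM-style sketch combining an N-tuple loss with a pointwise
# unlabeled-data term. The specific losses below are illustrative
# assumptions, not the paper's exact risk estimator.
import torch
import torch.nn as nn
import torch.nn.functional as F

N, D = 3, 10          # tuple size and feature dimension (toy values)
PRIOR = 0.5           # assumed class prior used by the unlabeled term

model = nn.Linear(D, 1)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

def tuple_loss(scores):
    # scores: (batch, N). Assume the first item of each tuple should score
    # highest and apply a logistic (softplus) loss to the margins.
    margins = scores[:, :1] - scores[:, 1:]
    return F.softplus(-margins).mean()

def unlabeled_loss(scores_u):
    # Pointwise unlabeled term: pull the average predicted positive
    # probability toward an assumed class prior (one common choice).
    return (torch.sigmoid(scores_u).mean() - PRIOR) ** 2

# Toy data: a batch of N-tuples plus pointwise unlabeled examples.
x_tuples = torch.randn(64, N, D)
x_unlabeled = torch.randn(256, D)

for step in range(200):
    opt.zero_grad()
    s_t = model(x_tuples.reshape(-1, D)).reshape(-1, N)
    s_u = model(x_unlabeled).squeeze(-1)
    loss = tuple_loss(s_t) + unlabeled_loss(s_u)   # empirical risk
    loss.backward()
    opt.step()
```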
The paper unifies the data-generation processes for N-tuples and pointwise unlabeled data and derives a generalization error bound as theoretical support.
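For intuition, ERM-based analyses of this kind typically yield a bound of the following generic shape (a Rademacher-complexity style form; the paper's exact complexity terms and constants may differ):

```latex
% Generic shape of an ERM generalization bound; the paper's exact terms may differ.
% With probability at least 1 - \delta,
\[
  R(\hat{f}) - R(f^{\ast})
  \;\le\;
  C_1\, \mathfrak{R}_{n_t}(\mathcal{F})
  + C_2\, \mathfrak{R}_{n_u}(\mathcal{F})
  + C_3 \sqrt{\tfrac{\ln(2/\delta)}{n_t}}
  + C_4 \sqrt{\tfrac{\ln(2/\delta)}{n_u}},
\]
% where n_t and n_u count the N-tuples and the pointwise unlabeled examples,
% \mathcal{F} is the hypothesis class, and \mathfrak{R} denotes Rademacher complexity.
```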
Extensive experiments on benchmark datasets confirm the effectiveness of the framework in improving generalization across different N-tuples learning tasks.