Lookup tables (LUTs) are commonly used to store precomputed values for mathematical computations. ReducedLUT is a novel method that reduces LUT size by injecting don't cares into the compression process. These don't cares introduce additional self-similarities in the LUT, which can then be exploited by known decomposition techniques. In a machine learning application, ReducedLUT achieves up to a $1.63\times$ reduction in LUT utilization with minimal accuracy degradation.
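To illustrate the general idea (this is a toy sketch, not the actual ReducedLUT algorithm), consider a LUT partitioned into fixed-width sub-tables, where some entries are don't cares because their values never affect application accuracy. Resolving each don't care to match an earlier sub-table lets multiple slots share one stored sub-table. The marker name `DC` and the greedy matching strategy below are illustrative assumptions:

```python
DC = None  # hypothetical don't-care marker

# Toy 8-entry LUT split into two 4-entry sub-tables; sub-table 1 has
# don't cares at positions 1 and 3.
lut = [3, 1, 4, 1,
       3, DC, 4, DC]

def fill_dont_cares(table, width):
    """Greedily resolve don't cares so sub-tables duplicate earlier ones."""
    subs = [table[i:i + width] for i in range(0, len(table), width)]
    uniques = []  # distinct sub-tables actually stored
    index = []    # which stored sub-table each slot maps to
    for sub in subs:
        for j, u in enumerate(uniques):
            # Compatible if every defined (non-don't-care) entry matches.
            if all(a is DC or a == b for a, b in zip(sub, u)):
                index.append(j)
                break
        else:
            # No match: store this sub-table, resolving don't cares to 0.
            uniques.append([a if a is not DC else 0 for a in sub])
            index.append(len(uniques) - 1)
    return uniques, index

uniques, index = fill_dont_cares(lut, 4)
print(uniques)  # [[3, 1, 4, 1]] -- one shared sub-table instead of two
print(index)    # [0, 0]
```

After filling, the LUT can be stored as one shared sub-table plus a small index, which is the kind of self-similarity that decomposition techniques exploit.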