Kernel ridge regression (KRR) is a fundamental method for learning functions from finite samples.
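As a concrete reference point, here is a minimal sketch of KRR with a Gaussian (RBF) kernel; the helper names (`rbf_kernel`, `fit_krr`, `predict_krr`), the kernel choice, and the regularization value are illustrative assumptions, not taken from the paper. The `n * lam` scaling follows the common convention of minimizing the average squared loss plus `lam` times the squared RKHS norm.

```python
# Minimal KRR sketch (illustrative, not the paper's implementation).
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Gram matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 * lengthscale^2)).
    sq_dists = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-sq_dists / (2 * lengthscale**2))

def fit_krr(X, y, lam=1e-3, lengthscale=1.0):
    # Representer coefficients: solve (K + n * lam * I) alpha = y.
    n = X.shape[0]
    K = rbf_kernel(X, X, lengthscale)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def predict_krr(X_train, alpha, X_test, lengthscale=1.0):
    # f_hat(x) = sum_i alpha_i * k(x, x_i).
    return rbf_kernel(X_test, X_train, lengthscale) @ alpha

# Example: recover f*(x) = sin(2*pi*x) from noiseless samples.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(50, 1))
y = np.sin(2 * np.pi * X[:, 0])
alpha = fit_krr(X, y, lam=1e-6)
X_test = np.linspace(0, 1, 200)[:, None]
print(predict_krr(X, alpha, X_test)[:5])
```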
A comprehensive study of KRR in the noiseless regime reveals optimal convergence rates determined by the kernel's eigenvalue decay and the smoothness of the target function.
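One common way to make these two quantities precise, sketched here under assumed notation (the kernel integral operator $L_K$ with eigenpairs $(\lambda_i, e_i)$ and the interpolation spaces $[\mathcal{H}]^s$; this is the standard spectral setup in the KRR literature, not quoted from the paper), is through a polynomial eigenvalue-decay condition and a source condition:

```latex
% Assumed spectral setup (standard in the literature, not quoted from the paper).
\begin{align}
  \lambda_i(L_K) \asymp i^{-\beta}, \quad \beta > 1,
  \qquad
  f^{*} \in [\mathcal{H}]^{s}
    := \Bigl\{\, \textstyle\sum_i a_i \lambda_i^{s/2} e_i \;:\; \sum_i a_i^2 < \infty \,\Bigr\},
  \quad s > 0 .
\end{align}
% Larger beta = faster eigenvalue decay (smaller effective hypothesis class);
% larger s = smoother target relative to the RKHS (s = 1 recovers membership in H).
```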
The KRR estimator exhibits extra smoothness compared with typical functions in the native reproducing kernel Hilbert space (RKHS).
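A population-level heuristic, again in the assumed notation above and not the paper's theorem, illustrates where this extra smoothness comes from: the ridge operator shrinks high-frequency coefficients by an additional factor.

```latex
% Heuristic illustration in the infinite-sample limit (assumed, not the paper's statement).
\begin{align}
  f^{*} = \sum_i \theta_i e_i
  \quad\Longrightarrow\quad
  f_{\lambda} := (L_K + \lambda)^{-1} L_K\, f^{*}
             = \sum_i \frac{\lambda_i}{\lambda_i + \lambda}\,\theta_i\, e_i .
\end{align}
% For eigendirections with lambda_i << lambda, the coefficient theta_i is damped by
% roughly lambda_i / lambda, so f_lambda has faster-decaying coefficients than f^{*},
% i.e. it behaves like a function of higher smoothness than a generic RKHS element.
```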
A novel error bound for KRR with noisy observations achieves minimax optimality in both the noiseless and noisy regimes.
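For orientation, a standard bias-variance heuristic (assumed here, with $\sigma^2$ the noise level and $\mathcal{N}(\lambda)$ the effective dimension; it is not the paper's exact bound) shows how a single bound can cover both regimes:

```latex
% Assumed heuristic decomposition; exact constants and conditions are in the paper.
\begin{align}
  \mathbb{E}\,\|\hat f_{\lambda} - f^{*}\|_{L^2}^2
    \;\lesssim\; \underbrace{\lambda^{\min(s,2)}}_{\text{bias}}
      \;+\; \underbrace{\frac{\sigma^2 \mathcal{N}(\lambda)}{n}}_{\text{variance}},
  \qquad
  \mathcal{N}(\lambda) = \operatorname{tr}\!\bigl[(L_K+\lambda)^{-1}L_K\bigr] \asymp \lambda^{-1/\beta}.
\end{align}
% With sigma > 0, balancing the two terms gives the classical minimax rate
% n^{-s*beta/(s*beta + 1)} (for s <= 2). With sigma = 0 the variance term vanishes,
% and taking lambda down to roughly n^{-beta} (the smallest lambda with N(lambda) <~ n)
% leaves a bias of order n^{-min(s,2)*beta}, which is how one bound can be sharp in both regimes.
```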