A Closed-Form Upper Bound for Admissible Learning-Rate Steps in Belief-Space Dynamics
May 7, 2026
Authors: Zixi Li, Youzhen Li
cs.AI
Abstract
Learning-rate steps are usually treated as hyperparameters. This paper isolates a local belief-space calculation: when an update is modeled as a projected forward step on the probability simplex, admissibility means contractivity in the natural KL/Bregman geometry. Under this model, the upper bound on an admissible step is not a tuning slogan but a closed-form formula.
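The abstract's model can be sketched concretely. A minimal illustration, not the paper's method: the KL/Bregman-projected forward step on the simplex is the multiplicative-weights update, and "admissibility as contractivity" can be checked numerically by verifying that the KL divergence to a reference point shrinks at each step. The loss, step size, and reference point below are all hypothetical choices for the sketch.

```python
import math

def mw_step(p, g, eta):
    """One KL/Bregman-projected forward step on the probability simplex
    (the multiplicative-weights / entropic mirror-descent update):
    p_i <- p_i * exp(-eta * g_i), renormalized."""
    w = [pi * math.exp(-eta * gi) for pi, gi in zip(p, g)]
    z = sum(w)
    return [wi / z for wi in w]

def kl(p, q):
    """KL divergence D(p || q) between two points on the simplex."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy contractivity check: iterate the step under a fixed linear loss and
# verify the KL distance to a (smoothed) minimizer decreases every step.
p = [0.5, 0.3, 0.2]
g = [1.0, 0.2, 0.5]              # hypothetical fixed gradient (linear loss)
eta = 0.1                        # hypothetical step size, small enough here
target = [0.001, 0.998, 0.001]   # near the vertex with the smallest g_i

d_prev = kl(target, p)
for _ in range(20):
    p = mw_step(p, g, eta)
    d_next = kl(target, p)
    assert d_next < d_prev       # this step size is admissible: KL contracts
    d_prev = d_next
```

Taking eta large enough (e.g. several orders of magnitude larger) makes the renormalized step overshoot and the assertion fail, which is the phenomenon a closed-form admissibility bound would delimit.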