On the Entropic Cost of Measuring to Infer - Nathan Shettell

Statistical inference involves measuring multiple systems parameterized by a common unknown parameter, with the aim of reducing uncertainty about that parameter. It is well established that information processing, such as measurement and erasure, incurs an entropic cost. In this work, we quantify the minimal entropy required to perform inference under general measurement processes, focusing on how correlations between measurements affect this cost. We derive fundamental bounds in two paradigms: one where measurements are performed simultaneously, and another where they are performed sequentially, capturing the roles of spatial and temporal correlations, respectively. In both settings, we show that inter-measurement correlations can act as an entropy reservoir, allowing part of the entropy budget to be effectively recycled when these correlations are exploited. This recycled entropy can be used to perform additional measurements without increasing the overall entropic cost, thereby improving the quality of statistical inference. While developed in the context of inference, our framework applies more broadly, offering a thermodynamic lens on correlated measurement protocols in quantum information.
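
For context, the canonical statement of such an entropic cost (a standard background result, not a bound derived in this work) is Landauer's principle: erasing one bit of information in contact with a thermal bath at temperature $T$ dissipates heat

\[
    Q_{\mathrm{diss}} \;\ge\; k_B T \ln 2 ,
\]

or, equivalently, increases the entropy of the environment by at least $k_B \ln 2$ per bit erased.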