
Asymptotic lower bounds in estimating jumps

Abstract: We study the problem of the efficient estimation of the jumps of stochastic processes. We assume that the stochastic jump process $(X_t)_{t \in [0,1]}$ is observed discretely, with a sampling step of size $1/n$. In the spirit of Hájek's convolution theorem, we establish lower bounds for the estimation error of the sequence of jumps $(\Delta X_{T_k})_k$. As an intermediate result, we prove a LAMN property, with rate $\sqrt{n}$, when the marks of the underlying jump component are deterministic. We then deduce a convolution theorem, with an explicit asymptotic minimal variance, in the case where the marks of the jump component are random. To show that this lower bound is optimal, we prove that a threshold estimator of the sequence of jumps $(\Delta X_{T_k})_k$, based on the discrete observations, attains the minimal variance of the convolution theorem.
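The threshold estimator mentioned in the abstract can be illustrated with a minimal simulation sketch. This is not the paper's exact construction: the process, jump times, jump sizes, and the cutoff constant below are all illustrative assumptions. The idea is that, with $n$ equally spaced observations, an increment whose magnitude exceeds a suitable threshold $u_n$ (vanishing slower than the typical Brownian increment $\sqrt{1/n}$) is attributed to a jump, and that increment itself estimates the jump size.

```python
import numpy as np

# Hypothetical illustration (not the paper's exact estimator): detect jumps
# of a discretely observed jump process by thresholding the increments.
rng = np.random.default_rng(0)
n = 10_000
dt = 1.0 / n
sigma = 1.0

# Simulate increments of X = sigma * Brownian motion + two jumps on [0, 1].
increments = sigma * np.sqrt(dt) * rng.standard_normal(n)
jump_times = np.array([0.3, 0.7])       # assumed jump times (illustrative)
jump_sizes = np.array([0.8, -0.5])      # assumed jump sizes (illustrative)
jump_idx = (jump_times * n).astype(int)
increments[jump_idx] += jump_sizes

# Threshold: much larger than a typical diffusive increment sigma*sqrt(dt),
# but vanishing as n grows; the constant 5 is an illustrative choice.
u_n = 5.0 * sigma * np.sqrt(dt * np.log(n))

detected = np.flatnonzero(np.abs(increments) > u_n)
estimated_jumps = increments[detected]

print(detected / n)       # detected jump times
print(estimated_jumps)    # threshold estimates of the jump sizes
```

Each detected increment equals the true jump size plus a diffusive term of order $\sqrt{1/n}$, which is consistent with the $\sqrt{n}$ rate appearing in the LAMN property above.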
Contributor: Emmanuelle Clément
Submitted on: Thursday, February 28, 2013 - 9:52:19 AM
Last modification on: Tuesday, March 8, 2022 - 12:32:02 PM




Emmanuelle Clément, Sylvain Delattre, Arnaud Gloter. Asymptotic lower bounds in estimating jumps. Bernoulli, Bernoulli Society for Mathematical Statistics and Probability, 2014, 20 (3), pp. 1059-1096. ⟨10.3150/13-BEJ515⟩. ⟨hal-00795403⟩


