Asymptotic lower bounds in estimating jumps

Abstract: We study the efficient estimation of jumps for stochastic processes. We assume that the stochastic jump process $(X_t)_{t \in [0,1]}$ is observed discretely, with sampling step $1/n$. In the spirit of Hájek's convolution theorem, we establish lower bounds for the estimation error of the sequence of jumps $(\Delta X_{T_k})_k$. As an intermediate result, we prove a LAMN property, with rate $\sqrt{n}$, when the marks of the underlying jump component are deterministic. We then deduce a convolution theorem, with an explicit asymptotic minimal variance, in the case where the marks of the jump component are random. To show that this lower bound is optimal, we prove that a threshold estimator of the sequence of jumps $(\Delta X_{T_k})_k$, based on the discrete observations, attains the minimal variance of the convolution theorem.
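The threshold estimator mentioned in the abstract can be illustrated with a standard truncation scheme: an increment of $X$ over a step of size $1/n$ is declared a jump when its absolute value exceeds a cutoff $u_n = c\, n^{-\varpi}$ with $\varpi \in (0, 1/2)$. The sketch below is a minimal simulation of this idea; the jump-diffusion model, the constants `c`, `varpi`, and all simulation parameters are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10_000          # number of observations on [0, 1]
dt = 1.0 / n
sigma = 0.3         # diffusion coefficient (assumed constant here)
lam = 5.0           # jump intensity (assumption)

# Simulate increments of a simple jump-diffusion on the grid i/n:
# Brownian part plus a compound Poisson jump part with Gaussian marks.
increments = sigma * np.sqrt(dt) * rng.standard_normal(n)
num_jumps = rng.poisson(lam)
jump_times = np.sort(rng.uniform(0.0, 1.0, num_jumps))
jump_sizes = rng.normal(1.0, 0.2, num_jumps)      # the "marks"
jump_bins = np.minimum((jump_times * n).astype(int), n - 1)
for b, s in zip(jump_bins, jump_sizes):
    increments[b] += s

# Threshold rule: keep increments exceeding u_n = c * n^(-varpi),
# which vanishes more slowly than the typical Brownian increment
# sigma * n^(-1/2), so jumps are separated from diffusive noise.
varpi = 0.49
c = 4 * sigma
u_n = c * n ** (-varpi)
detected = np.abs(increments) > u_n
estimated_jumps = increments[detected]
```

Each retained increment estimates one jump $\Delta X_{T_k}$ up to the small Brownian contribution of the same interval, which is of order $n^{-1/2}$ and thus negligible relative to the jump size.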
Contributor : Emmanuelle Clément <>
Submitted on : Thursday, February 28, 2013 - 9:52:19 AM
Last modification on : Tuesday, May 14, 2019 - 12:46:01 PM


  • HAL Id : hal-00795403, version 1


Emmanuelle Clément, Sylvain Delattre, Arnaud Gloter. Asymptotic lower bounds in estimating jumps. Bernoulli, Bernoulli Society for Mathematical Statistics and Probability, 2014, 20 (3), pp. 1059-1096. ⟨hal-00795403⟩