We assume our time-series data are noisy observations $\tilde x$ of a diffusion
process $x$ following the Itô stochastic differential equation
$$dx(t)= \sigma(t) \ dW(t) + b(t) \ dt,$$
where $W$ is a Brownian motion on a filtered probability space and $\sigma$
and $b$ are random processes adapted to the Brownian filtration.
The integrated variance of the process over the time interval $[0,T]$ is defined as
$$\int_0^T \sigma^2(t) dt.$$
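For instance, in the special case of a constant deterministic volatility $\sigma(t)\equiv\bar\sigma$ this reduces to
$$\int_0^T \sigma^2(t)\, dt = \bar\sigma^2\, T;$$
in general, however, $\sigma$ is a random process, so the integrated variance is itself a random quantity.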
For any positive integer $n$, let ${\cal S}_{n}:=\{ 0=t_{0}\leq \cdots
\leq t_{n}=T \}$ be the observation times.
The observations are affected by i.i.d. noise terms $\eta(t_i)$ with
mean zero and finite variance, so that
$$\tilde x(t_i)=x(t_i)+\eta(t_i).$$
See the Reference for further mathematical details.
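To make the observation model concrete, the following is a minimal simulation sketch (not part of the original text): it generates a path of $x$ with an Euler scheme for an assumed deterministic $\sigma(t)$ and constant drift $b$, adds i.i.d. Gaussian noise, and records the true integrated variance for later comparison. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def simulate_noisy_path(n=23400, T=1.0, noise_std=5e-4, seed=0):
    """Simulate noisy observations of a diffusion on an equispaced grid of [0, T].

    Hypothetical example: sigma(t) and b are taken deterministic for simplicity,
    although the model allows them to be adapted random processes.
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, T, n + 1)                            # observation times t_0, ..., t_n
    dt = T / n
    sigma = 0.2 * (1.0 + 0.5 * np.sin(2.0 * np.pi * t / T))   # assumed volatility path
    b = 0.05                                                  # assumed constant drift

    # Euler scheme for dx = sigma dW + b dt
    dW = rng.normal(0.0, np.sqrt(dt), n)
    x = np.concatenate(([0.0], np.cumsum(sigma[:-1] * dW + b * dt)))

    # i.i.d. noise with mean zero and finite variance
    eta = rng.normal(0.0, noise_std, n + 1)
    x_tilde = x + eta

    # Riemann-sum approximation of the true integrated variance, for benchmarking
    true_iv = np.sum(sigma[:-1] ** 2 * dt)
    return t, x_tilde, true_iv
```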
Moreover, for $i=0,\ldots,n-1$, let $\delta_i(\tilde x):= \tilde x(t_{i+1})-\tilde x(t_i)$
denote the increments of $\tilde x$.
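In code, these increments are just first differences of the observed series; a one-line illustration, assuming the noisy observations are stored in a NumPy array `x_tilde` as in the sketch above:

```python
delta = np.diff(x_tilde)  # delta[i] = x_tilde[i+1] - x_tilde[i], for i = 0, ..., n-1
```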
The optimal cutting frequency $N$ for computing the Fourier estimator of
the integrated variance from noisy time-series data is obtained by
minimizing the estimated mean squared error (MSE).
The Fourier estimator of the integrated variance over $[0,T]$ is
then defined as
$$\widehat\sigma^{2}_{n,N}:= {T^2 \over {2N+1}}\sum_{|s|\leq N} c_s(d\tilde x_n)
c_{-s}(d\tilde x_n),$$
where, for any integer $k$ with $|k|\leq N$, the discretized Fourier
coefficients of the increments are
$$c_k(d\tilde x_{n}):= {1\over {T}} \sum_{i=0}^{n-1} e^{-{\rm i} {{2\pi}\over {T}}
kt_i}\delta_i(\tilde x).$$
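The two displayed formulas translate directly into code. The sketch below is an illustration under the same assumptions as above (times `t` of length $n+1$, increments `delta` of length $n$), not a reference implementation; the function names are ours. The choice of the cutting frequency $N$ by MSE minimization is only indicated schematically, since the expression of the estimated MSE is given in the Reference rather than here.

```python
import numpy as np

def fourier_coefficients(t, delta, N, T):
    """Discretized Fourier coefficients c_s(d x_tilde_n) for all |s| <= N."""
    s = np.arange(-N, N + 1)
    # c_s = (1/T) * sum_i exp(-i * (2*pi/T) * s * t_i) * delta_i
    phase = np.exp(-1j * 2.0 * np.pi * np.outer(s, t[:-1]) / T)
    return phase @ delta / T

def fourier_estimator(t, delta, N, T=None):
    """Fourier estimator of the integrated variance over [0, T] with cutoff N."""
    if T is None:
        T = t[-1] - t[0]
    c = fourier_coefficients(t, delta, N, T)
    # reversing the array maps c_s to c_{-s}, so this is sum over |s| <= N of c_s * c_{-s}
    est = (T**2 / (2 * N + 1)) * np.sum(c * c[::-1])
    return est.real

# Schematic selection of the cutting frequency: evaluate a grid of candidate N
# and keep the minimizer of the estimated MSE (the MSE expression from the
# Reference would replace the hypothetical `estimated_mse` placeholder).
# candidates = range(10, 500, 10)
# N_opt = min(candidates, key=lambda N: estimated_mse(t, delta, N))
```

On simulated data, `fourier_estimator(t, np.diff(x_tilde), N)` can be compared with the true integrated variance returned by the simulation sketch above to check the behaviour of the estimator across values of $N$.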