\documentclass[a1,portrait]{a0poster}
%\usepackage{alltt}
\usepackage{color}
\usepackage{times}
\usepackage{a0poster}
\usepackage{graphicx}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{document}
\title{A New Statistical Approach to Detecting Activation in fMRI datasets}
\author{Jonathan L. Marchini and Brian D. Ripley}
\address{Department of Statistics, University of Oxford, UK$^\dagger$}
\email{\{marchini,ripley\}@stats.ox.ac.uk}
\makeheader
%% Column 1
\begin{center}
\col{
\paragraph{Introduction}
In a voxel-based analysis of a single-subject fMRI dataset the primary task is to assign a level of
significance to each voxel. This depends on the estimation of both the magnitude
of the response to the stimulus and the serial dependence of the
time series, and especially on the assumptions made in that
estimation.
Various methods have been proposed in the literature to do this in
both the time domain and frequency domain. The duality between these
two domains implies that a given method can be computed in either
domain, although this may prove easier in one domain than in the
other. By considering a periodic stimulus design in the frequency
domain we found the analysis to be greatly simplified and at the same
time provided insight into how non-periodic designs could be dealt
with in a similar fashion.
\paragraph{Trend Removal}
Many voxel time series exhibit low frequency trend components. These
may be due to aliased high-frequency physiological components or
drifts in scanner sensitivity. Whatever the cause, these trends tend
to vary non-linearly in time and may result in false positive
activations if they are not accounted for in the model. We have found a simple
running-lines smoother \cite{vr:1999} to be a reliable method of trend removal (see Figure~\ref{trends}).
\begin{figure}
\includegraphics[height=90mm,bb=25 169 579
609]{jm.fig3.eps}
\includegraphics[height=90mm,bb=25 169 579
609]{jm.fig2.eps}
\caption{Trend removal: (left) an obvious non-linear trend; (right) a voxel in an area of activation}
\label{trends}
\end{figure}
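The detrending step can be sketched as follows; this simple local least-squares version is an illustration of the running-lines idea rather than an exact reproduction of the smoother in \cite{vr:1999}, and the window half-width is an arbitrary illustrative choice.

```python
import numpy as np

def running_lines_detrend(y, half_width=20):
    """Estimate a slowly varying trend by fitting a least-squares line
    to the points in a moving window and taking the fitted value at
    the window's centre; return the detrended series and the trend."""
    n = len(y)
    t = np.arange(n, dtype=float)
    trend = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        # Fit y = a + b*t locally and evaluate the line at t[i]
        b, a = np.polyfit(t[lo:hi], y[lo:hi], 1)
        trend[i] = a + b * t[i]
    return y - trend, trend
```

Because the fit is local, the estimated trend can follow slow non-linear drifts that a single global polynomial would miss.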
\paragraph{Periodic designs}
The analysis of experiments in which the stimulus is periodic is greatly simplified if the model for the response is applied using spectral-domain time-series methods. By assuming the design consists of $c$ repeats of a symmetric
baseline-activation block and that the response to the stimulus contains the majority of its power at the fundamental frequency of stimulation, $\omega_c = c/\delta n$, a test can be constructed based upon the periodogram ordinate at that frequency. For a series $\{y_t \,:\, t=0,\ldots ,n-1\}$ this is defined as
\begin{equation} \label{periodogram.eqn}
I(\omega_j) = n\left| d_y (\omega_j) \right|^2
\end{equation}
where
\begin{equation} \label{dft}
d_y(\omega_j)=n^{-1}\sum_{t=0}^{n-1}y_t \exp (-i2 \pi \omega_j\delta t)
\end{equation}
is the discrete Fourier transform of the series $y_t$. It is easily shown that $I(\omega_j)$ is the optimal estimate of the power at Fourier frequency $\omega_j$.
Figure~\ref{periodogram} shows the periodogram of the detrended voxel time series shown in Figure~\ref{trends}.
The large spike at the 5th Fourier frequency illustrates how the response has been effectively restricted to this frequency.
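With the definitions above, the periodogram is computed directly from the FFT; in this sketch the inter-scan interval $\delta$ is taken as 1 for simplicity.

```python
import numpy as np

def periodogram(y):
    """Periodogram I(w_j) = n * |d_y(w_j)|^2 at the Fourier
    frequencies w_j = j/n (sampling interval delta taken as 1)."""
    n = len(y)
    d = np.fft.fft(y) / n          # discrete Fourier transform d_y(w_j)
    return n * np.abs(d) ** 2      # ordinates for j = 0, ..., n-1
```

For a noiseless sinusoid completing $c$ cycles in $n$ scans, all of the power falls on the single ordinate $j=c$, which is the behaviour exploited by the test.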
\begin{figure}
\includegraphics[height=110mm,bb=25 169 579 609]{jm.fig5.eps}
\caption{Periodogram of the detrended voxel time series shown in figure~\ref{trends} (right). The lines represent spectral density estimates before (thick line) and after (dotted line) spatial smoothing was applied to the spectral estimates.}
\label{periodogram}
\end{figure}
}
%% Column 2
\col{
\paragraph{Theory}
The asymptotic sampling properties of the periodogram are well-known \cite{priestley:1981}.
For increasingly long series:
\begin{itemize}
\item [(I).] $I(\omega_j)/f(\omega_j) \sim \textrm{E}$ where E has a
standard exponential distribution.
\item [(II).] $I(\omega_j)$ and $I(\omega_k)$ are independent for all
$k\ne j$.
\end{itemize}
where $f(\omega)$ is the spectral density of the underlying stationary stochastic process.
This is a very general result that includes AR, MA and ARMA processes as special cases, and it is known to be accurate for series of moderate
length, such as those encountered in fMRI experiments. If we assume that the underlying spectral
density $f(\omega)$ is smooth then we can use a smoothed version of
$I(\omega)$, which we denote $g(\omega)$, to estimate $f(\omega)$.
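Property (I) is easy to check by simulation in the simplest case of Gaussian white noise, whose spectral density is flat; with the normalisation used here, $f(\omega) = \sigma^2$, so the scaled ordinates should behave like standard exponential draws. This small simulation is an illustration only, not part of the analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2 = 512, 4.0
y = rng.normal(scale=np.sqrt(sigma2), size=n)

# Periodogram ordinates at the Fourier frequencies j = 1, ..., n/2 - 1
I = np.abs(np.fft.fft(y)) ** 2 / n
ratios = I[1:n // 2] / sigma2   # f(w_j) = sigma^2 for white noise

# Property (I): the ratios should look like Exp(1) draws (mean 1)
print(ratios.mean())
```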
\paragraph{Estimating the spectral density}
We use a non-parametric estimate of the spectral density and thus avoid the assumptions implicit in parametric approaches. An estimate of the spectral density is obtained by smoothing the
periodogram on the log scale \cite{wahba:1980}, $\log(I(\omega))$, to
ensure that the spectral density estimate is positive everywhere. A smoothing spline \cite{silverman:1985} was chosen to
smooth the log-periodogram. The spline is given more freedom at low frequencies to accommodate the effect of the detrending.
To ensure the spectral density of the noise is estimated independently
of any response to the stimulus, the periodogram ordinates at the
fundamental frequency of stimulation and its first two harmonics are
not included in this procedure. Also, a small amount of spatial smoothing is applied to the spectral density estimates. Figure~\ref{periodogram} shows the spectral density estimates before (thick line) and after (dotted line) spatial smoothing.
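The smoothing step can be sketched as follows, using SciPy's \texttt{UnivariateSpline} as a stand-in for the smoothing spline of \cite{silverman:1985}; the smoothing parameter and the exact bias correction are illustrative, and the extra low-frequency flexibility and spatial smoothing described above are omitted.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

EULER_GAMMA = 0.5772156649  # -E[log E] for a standard exponential E

def spectral_density_estimate(I, omit, s=None):
    """Estimate the spectral density by spline-smoothing the
    log-periodogram ordinates I (at j = 1, ..., m), omitting the
    indices in `omit` (the fundamental frequency of stimulation and
    its first two harmonics)."""
    m = len(I)
    j = np.arange(1, m + 1)
    keep = ~np.isin(j, omit)
    spline = UnivariateSpline(j[keep], np.log(I[keep]), s=s)
    # exp(EULER_GAMMA) corrects the downward bias of the log transform
    return np.exp(spline(j) + EULER_GAMMA)
```

Smoothing on the log scale guarantees a positive estimate; the additive constant compensates for the fact that the mean of the log of an exponential variable sits below the log of its mean.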
\paragraph{Testing for a response to the stimulus}
The spectral density estimate provides us with a baseline against
which to test for significant departures from the underlying process.
From (I),
\begin{equation}
\frac{I(\omega_j)}{f(\omega_j)} \sim E
\label{eq:13}
\end{equation}
Substituting $g(\omega_j)$ for $f(\omega_j)$, we define the ratio
statistic, $R_j$, as
\begin{equation}
R_j = \frac{I(\omega_j)}{g(\omega_j)},\qquad\qquad \omega_j=j/\delta
n
\label{eq:14}
\end{equation}
By calculating the statistic $R_j$ at the fundamental frequency of
activation, $\omega_c$, we obtain a test statistic, $R_c$, for
significant activation. Large values of $R_c$ indicate a large effect at the fundamental
frequency. Under the null hypothesis of no activation, all of the statistics $R_j$, apart from those at $j=0$ and $j=n/2$, have the same distribution as $R_c$ and thus can be used as a benchmark against which to check the theoretical distribution of $R_c$, at negligible computational cost. Thus the method is effectively self-calibrating.
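Since the ratio in (\ref{eq:13}) is approximately standard exponential, a one-sided $p$-value follows directly from the exponential tail; a minimal sketch:

```python
import numpy as np

def ratio_test(I, g, c):
    """Ratio statistic R_c = I(w_c)/g(w_c) and its p-value under the
    approximate Exp(1) null distribution: P(E > R_c) = exp(-R_c)."""
    R_c = I[c] / g[c]
    p_value = np.exp(-R_c)
    return R_c, p_value
```

For example, a ratio of $R_c = \log 10^4 \approx 9.21$ corresponds to the $p=10^{-4}$ threshold used in Figure~\ref{vis}.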
\begin{figure}
\includegraphics[height=90mm,bb=100 265 500
590]{jm.fig12.eps}
\includegraphics[height=90mm,bb=100 265 500
590]{jm.fig14.eps}
\caption{$p$-value maps thresholded at $10^{-4}$ obtained using our approach (left) and an AR(1) approach (right)}
\label{vis}
\end{figure}
\begin{figure}
\includegraphics[height=130mm,bb=18 270 570 570]{null.pp.ps}
%\includegraphics[height=90mm,bb=23 140 555 700]{null.pp.ps}
\caption{$PP$-plots for our approach (left) and an AR(1) approach (right) applied to a null dataset}
\label{pp.plots}
\end{figure}
}
%% Column 3
\col{
\paragraph{Results}
We have compared our method to an implementation of the AR(1) approach of \cite{Bullmore:1996} in which we apply our own non-linear detrending. Figure~\ref{vis} shows thresholded $p$-value maps of the same slice from a periodic visual stimulation dataset. The AR(1) approach shows considerably more false-positive activation in areas away from the visual cortex. We also applied the two approaches to a null dataset, assuming a periodic stimulus. The distribution of each statistic was compared to its theoretical form using the $PP$-plots shown in Figure~\ref{pp.plots}. The left plot shows the $PP$-plot of the $R_c$ statistic (thick line) and the $PP$-plots for $R_j$ statistics at a range of higher frequencies (dotted line). The agreement of the dotted line with the theoretical distribution (the straight line at $45^\circ$) validates the use of the theoretical distribution with our method. For the AR(1) approach, shown in the right plot, the theoretical distribution is not an adequate fit, and a randomisation experiment would be needed to correct for this.
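The $PP$-plot construction itself is straightforward: the theoretical distribution function evaluated at the sorted statistics is plotted against the empirical distribution function. A minimal sketch, assuming the approximate Exp(1) reference distribution for the ratio statistics:

```python
import numpy as np

def pp_points(stats):
    """PP-plot coordinates for `stats` against the Exp(1) reference:
    the theoretical CDF at the sorted values versus the empirical CDF.
    Points on the 45-degree line indicate good agreement."""
    x = np.sort(stats)
    theoretical = 1.0 - np.exp(-x)                    # Exp(1) CDF
    empirical = np.arange(1, len(x) + 1) / (len(x) + 1.0)
    return theoretical, empirical
```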
\paragraph{High frequency artefacts}
In some datasets we have found high-frequency artefacts that occur in narrow bands and have been attributed to Nyquist ghosting (Figure~\ref{ghost}, right). We have detected these artefacts using our values of $R_j$ at high frequencies (Figure~\ref{ghost}, left, shows a thresholded $R_{88}$ image).
\begin{figure}
\includegraphics[height=80mm,bb=100 265 500 590]{jm.fig19.eps}
\includegraphics[height=90mm,bb=90 210 470 600]{jm.fig17.eps}
\caption{High-frequency artefacts: (left) thresholded $R_{88}$ image; (right) narrow-band artefact attributed to Nyquist ghosting}
\label{ghost}
\end{figure}
We have found that parametric time-domain models may be extremely susceptible to these artefacts, whereas non-parametric spectral density estimation is resistant to them. Three methods were applied to a voxel in an area exhibiting the high-frequency artefact. Method I: the AR(1) approach described above; Method II: as Method I but with the high-frequency artefact removed; Method III: our approach. Comparing Methods I and II in the table below, we see that the estimated parameters differ substantially once the artefact has been removed, which results in a misleading statistic. Even after removal, there is no guarantee that the AR(1) model is flexible enough, and this accounts for the difference between Methods II and III.
\begin{center}
\small
\begin{tabular}{|l||c|c|c|c|c|}
\hline
& AR(1) coefficient & $\widehat{\sigma^2}$ & Numerator & Denominator & Ratio Statistic\\
\hline \hline
Method I & $-$0.0027 & 15765 & 591.88 & 315.29 & 1.877\\
\hline
Method II & 0.2839 & 11276 & 624.98 & 419.46 & 1.490\\
\hline
Method III& --- & --- & 689.30 & 513.88 & 1.341\\
\hline
\end{tabular}
\end{center}
\paragraph{Conclusion and extensions}
Non-parametric spectral estimation is shown to be an accurate and self-calibrating approach for analysing periodic designs. The method makes few assumptions and is resistant to high-frequency artefacts, whereas parametric time-domain approaches may be susceptible to these artefacts and biased by the assumptions they make about the form of the spectral density. The method can easily be extended to handle non-periodic event-related designs, and initial results are extremely promising.
\paragraph{Acknowledgements} We are grateful to Dr Stephen Smith (fMRIB Centre, Oxford) for
advice and datasets.
\references
\bibliography{jm.ref}
}
\end{center}
\makefooter
\end{document}