The past few days have seen my fMRI analysis server taken over by a Linux virtual machine. I have installed FSL, and have been using MELODIC to plough my way through ICA analyses of fcMRI data, a first for me.

One of the annoyances I have had to deal with as part of this project has been the difference in input data required by SPM, at which my preprocessing stream is targeted, and by FSL, at which it is not. Specifically, this difference has necessitated the conversion of data runs from multiple 3D NIFTI files to a single 4D NIFTI file. FSL has a utility for this (fslmerge), but being the Linux novice that I am, I have struggled to script the merging within the virtual machine.

Thankfully, SPM has a semi-hidden utility for this conversion.

SPM's 3D to 4D NIFTI conversion tool

The GUI is located within the Batch Editor’s SPM>Util menu, and by default saves the specified 3D NIFTI images to a single 4D NIFTI image within the same directory. It doesn’t gzip the output image like the fslmerge utility does, but it’s scriptable using the ‘View>Show .m Code’ menu option, and it’s good enough for me.
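
If you’d rather skip the GUI entirely, recent SPM versions (SPM8 onwards, I believe) also ship a command-line function, spm_file_merge, that does the same job. A minimal sketch, in which the filename filter and output name are placeholders:

% Merge all 3D volumes for one run into a single 4D NIfTI (sketch)
files = spm_select('FPList', pwd, '^frun1_.*\.nii$');   % placeholder filter for the 3D images
spm_file_merge(files, 'run1_4D.nii');                   % writes the 4D file alongside them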

 

Do not duplicate

I ran into the following error when trying to use a script to make Marsbar extract betas:

Error in pr_stat_compute at 34

Indices too large for contrast structure

This problem occurred when I was trying to extract the betas for an unusual participant who had an empty bin for one condition, and for whom I had therefore had to manually alter the set of contrasts.  In doing this, it turns out that I had inadvertently duplicated one contrast vector.  Although the names were different, the number of contrasts had been amended to reflect the number of unique contrast vectors in SPM.xCon but not in Marsbar’s D, meaning that pr_stat_compute’s ‘xCon = SPM.xCon’ (line 23) did not return the same value as my own script’s ‘xCon = get_contrasts(D)’.  The two xCons therefore differed in length, producing the error in pr_stat_compute.

The solution lay in removing the duplicate contrasts from the contrast specification for that participant.
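
A quick way to catch this before Marsbar complains is to load the participant’s SPM.mat and check for repeated contrast vectors. A rough sketch, assuming all contrasts are T-contrasts (so each xCon.c is a single column):

% Flag contrast vectors that appear more than once in SPM.xCon (sketch)
load('SPM.mat');
C = [SPM.xCon.c]';                       % one contrast vector per row
[~, keepidx] = unique(C, 'rows');        % rows retained as unique
dupes = setdiff(1:size(C, 1), keepidx);  % rows duplicating another contrast
if ~isempty(dupes)
    fprintf('Duplicate contrast vector(s) at index: %s\n', num2str(dupes));
end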

Since setting up the lab in St Andrews I’ve consistently run into a DICOM Import error that causes the process to terminate about half-way through. I finally fixed the problem today after a quick search on the SPM mailing list.

The error I was receiving was as follows:

Running 'DICOM Import'
Changing directory to: D:\Akira Cue Framing 2011\PP03
Failed 'DICOM Import'
Error using ==> horzcat
CAT arguments dimensions are not consistent.
In file "C:\spm8\spm_dicom_convert.m" (v4213), function "spm_dicom_convert" at line 61.
In file "C:\spm8\config\spm_run_dicom.m" (v2094), function "spm_run_dicom" at line 32.

The following modules did not run:
Failed: DICOM Import

??? Error using ==> cfg_util at 835
Job execution failed. The full log of this run can be found in MATLAB command window, starting with the lines (look for the line
showing the exact #job as displayed in this error message)
——————
Running job #[X]
——————

Error in ==> spm_jobman at 208

??? Error while evaluating uicontrol Callback

This was a little mysterious, as the appropriate number of NIfTI files appeared to have been written before the process terminated unexpectedly.

The following link suggested an SPM code tweak that might fix it:
https://www.jiscmail.ac.uk/cgi-bin/webadmin?A2=ind1106&L=SPM&P=R49499&1=SPM&9=A&J=on&d=No+Match%3BMatch%3BMatches&z=4

The proposed fix from John Ashburner simply requires changing line 61 of spm_dicom_convert.m from:

out.files = [fmos fstd fspe];

to:

out.files = [fmos(:); fstd(:); fspe(:)];

Works like a charm!

 

SPM will, by default, show you 3 local maxima within each cluster displayed when you click ‘whole brain’ within Results.  To change the default number of local maxima displayed in the output table, edit spm_list.m and change the value assigned to the variable ‘Num’ (line 201 in the spm_list.m supplied with SPM8). I currently have it set to 64.

You can also edit the variable ‘Dis’ in the same .m file (line 202 in SPM8) to change the minimum distance between peak voxels displayed.
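
From memory, the relevant lines in SPM8’s spm_list.m look something like the following; the exact defaults and line numbers may differ in your copy:

Num     = 3;    % number of local maxima reported per cluster (change to e.g. 64)
Dis     = 8;    % minimum distance between reported maxima, in mm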

Below (indented) is a straightforward MATLAB/SPM/Marsbar script for generating separate Marsbar ROI and Nifti ROI spheres (user-defined radius) from a set of user-defined MNI coordinates stored in a text file (‘spherelist.txt’ saved in the same directory as the script).  ROI files are saved to mat and img directories that the script creates.
I use this script to generate *.mat files as seeds for resting connectivity analyses.  Once a *.mat ROI is generated, Marsbar can be used to extract raw timecourses from this region to feed into connectivity analysis as the regressor of interest.  Because MRIcron doesn’t read *.mat Marsbar ROI files, I render the equivalent *.img seed regions on a canonical brain when I need to present them graphically.

% This script loads MNI coordinates specified in a user-created file,
% spherelist.txt, and generates .mat and .img ROI files for use with
% Marsbar, MRIcron etc.  spherelist.txt should list the centres of
% desired spheres, one-per-row, with coordinates in the format:
% X1 Y1 Z1
% X2 Y2 Z2 etc
% .mat sphere ROIs will be saved in the script-created mat directory.
% .img sphere ROIs will be saved in the script-created img directory.
% SPM toolbox Marsbar should be installed and started before running the script.

% Specify radius of spheres to build in mm
radiusmm = 4;

% Load the list of sphere centres
load('spherelist.txt')

% Specify output folders for the two sets of ROIs (.img format and .mat format)
roi_dir_img = 'img';
roi_dir_mat = 'mat';
% Make an img and a mat directory to hold the resulting ROIs
mkdir(roi_dir_img);
mkdir(roi_dir_mat);

% Go through each set of coordinates loaded from spherelist.txt
spherelistrows = size(spherelist, 1);
for spherenumbers = 1:spherelistrows
    % maximum is the centre of the sphere in mm in MNI space
    maximum = spherelist(spherenumbers, 1:3);
    sphere_centre = maximum;
    sphere_radius = radiusmm;
    sphere_roi = maroi_sphere(struct('centre', sphere_centre, ...
        'radius', sphere_radius));

    % Define sphere name using its coordinates
    coordsx = num2str(maximum(1));
    coordsy = num2str(maximum(2));
    coordsz = num2str(maximum(3));
    spherelabel = sprintf('%s_%s_%s', coordsx, coordsy, coordsz);
    sphere_roi = label(sphere_roi, spherelabel);

    % Save ROI as a MarsBaR ROI (.mat) file
    saveroi(sphere_roi, fullfile(roi_dir_mat, sprintf('%dmmsphere_%s_roi.mat', ...
        radiusmm, spherelabel)));
    % Save ROI as an image (.img) file
    save_as_image(sphere_roi, fullfile(roi_dir_img, sprintf('%dmmsphere_%s_roi.img', ...
        radiusmm, spherelabel)));
end

UPDATE: WordPress messed with the characters in the above script, so here is a link to the script file and an example spherelist.txt file.

Here’s an interesting wikibooks page detailing how you can make SPM faster.

http://en.wikibooks.org/wiki/SPM/Faster_SPM

Some of the tweaks involve simply adjusting the spm_defaults.m file to utilise the amount of RAM you have installed at the model estimation stage.  Others involve a more costly (and potentially hugely beneficial?) purchase of the Parallel Computing Toolbox to utilise many cores in a single machine, or many machines coordinated by a server.  I’ll certainly be taking a look at these tweaks in the coming weeks and months.

EDIT: Changing the defaults.stats.maxmem parameter from its default value of 2^20 to 2^33 (in order to use a maximum of 8GB of available memory, as outlined in the screengrab from the wikibooks site below) looks to have sped model estimation up by maybe a factor of 10.

A defaults variable 'maxmem' indicates how much memory can be used at the same time when estimating a model. If you have loads of memory, you can increase that memory setting in spm_defaults.m

Assuming you have a large amount of RAM to utilise, this is a HUGE time-saving tweak.  Even if you don’t have a large amount of RAM, I’m sure you can speed things up considerably by specifying the value as something greater than the meagre 1MB (2^20) SPM allocates by default.
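
For reference, the line to edit in spm_defaults.m looks something like this (the exact default varies between SPM versions); with 8GB of RAM to spare, the change is simply:

defaults.stats.maxmem   = 2^20;     % default: roughly 1MB of data processed in memory at once
% becomes, for example:
defaults.stats.maxmem   = 2^33;     % allow up to 8GB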

SEE ALSO: https://www.jiscmail.ac.uk/cgi-bin/webadmin?A2=SPM;863a049c.1105 which has the following recommendation:

[change] line 569 in spm_spm from:

nbz = max(1,min(zdim,floor(mmv/(xdim*ydim)))); nbz = 1; %-# planes

to:

nbz = max(1,min(zdim,floor(mmv/(xdim*ydim)))); %-# planes

[i]n order to load as much data as possible into the RAM at once.

By default, SPM masks the images that contribute to an analysis at the Estimation stage.  If a voxel is masked out because it fails to exceed an arbitrary analysis threshold (set to a default of 0.8 in SPM5), then its values are replaced with NaNs, and that voxel does not contribute to the final output.  Incidentally, this masking contributes to the non-analysis of orbitofrontal and frontopolar regions as a consequence of signal dropout.

If you want to include voxels that do not exceed the threshold (useful if you are interested in analysing data presented in unusual units, e.g. maps of residuals), you can edit the spm_defaults.m file.  Around line 42 should be the following text:

% Mask defaults
%=======================================================================
defaults.mask.thresh    = 0.8;
This can be edited (e.g. replaced with -Inf if you want to remove the threshold altogether), the spm_defaults.m file saved, and the analysis run with a more liberal masking threshold in place.  This can greatly increase the number of comparisons that are made, and can include a lot of computationally expensive junk, i.e. comparisons with non-brain tissue.  To get around this issue, it is worthwhile setting an explicit mask at the model specification stage (e.g. the whole brain mask I wrote about here) whenever you lower the implicit masking threshold.
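
A minimal sketch of the two changes described above: the defaults line lives in spm_defaults.m, while the explicit mask is set in a model-specification batch (the mask filename here is a placeholder for your own whole-brain image):

defaults.mask.thresh = -Inf;    % in spm_defaults.m: disable the implicit analysis threshold

matlabbatch{1}.spm.stats.fmri_spec.mask = {'wholebrain_mask.nii,1'};    % explicit mask in the model spec batch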

There is a little more on this from the SPM list here.  As with all SPM tweaks, make note of what you have tweaked, and make sure you change it back to its default setting once you have done what you set out to do.

Occasionally, it’s nice to look under the bonnet and see what’s going on during any automated process that you take for granted.  More often than not, I do this when the automaticity has broken down and I need to fix it (e.g. my computer won’t start), or if I need to modify the process in a certain way as to make its product more useful to me (e.g. installing a TV card to make my computer more ‘useful’ to me).  This is especially true with tools such as SPM.

One of the greatest benefits associated with using SPM is that it’s all there, in one package, waiting to be unleashed on your data.  You could conduct all of your analyses using SPM only, and you could never need to know how SPM makes the pretty pictures that indicate significant brain activations according to your specified model.  That’s probably a bad idea.  You, at least, need to know that SPM is conducting lots and lots of statistical tests – regressions – as discussed in the previous post.  If you have a little understanding of regressions, you’re then aware that what isn’t fit into your regression model is called a ‘residual’ and there are a few interesting things you can do with residuals to establish the quality of the regression model you have fit to your data.  Unfortunately with SPM, this model fitting happens largely under the bonnet, and you could conduct all of your analyses without ever seeing the word ‘residual’ mentioned anywhere in the SPM interface.

Why is this?  I’m not entirely sure.  During the process of ‘Estimation’, SPM writes all of your residuals to disk (in the same directory as the to-be-estimated SPM.mat file) as a series of image files:

ResI_0001.img ResI_0001.hdr
ResI_0002.img ResI_0002.hdr
ResI_0003.img ResI_0003.hdr

ResI_xxxx.img ResI_xxxx.hdr
(xxxx runs from 0001 up to the number of scans that contribute to the model.)

Each residual image will look something like this when displayed in SPM. You can see from the black background that these images are necessarily subject to the same masking as the beta or con images.

SPM then deletes these images once estimation is complete, leaving you to formulate a workaround if you want to recover the residuals for your model.  One reason SPM deletes the residual image files is that they take up a lot of disk space – the residuals add nearly 400MB (in our 300 scan model) for each participant, which is a real pain if you’re estimating lots of participants and lots of models.

If you’re particularly interested in exploring the residual images (for instance, you can extract the timecourse of residuals for the entire run from an ROI using Marsbar), you need to tweak SPM’s code.  As usual, the SPM message-board provides information on how to do this.

You can read the original post here, or see the relevant text below:

… See spm_spm.m, searching for the text “Delete the residuals images”.  Comment out the subsequent spm_unlink lines and you’ll have the residual images (ResI_xxxx.img) present in the analysis directory.
Also note that if you have more than 64 images, you’ll also need to change spm_defaults.m, in particular the line
defaults.stats.maxres   = 64;
which is the maximum number of residual images written.
There are a few steps here:
1) open spm_spm.m for editing by typing
>> edit spm_spm
2) Find the following block of code (lines 960-966 in my version of SPM5):
%-Delete the residuals images
%==========================================================================
for  i = 1:nSres,
spm_unlink([spm_str_manip(VResI(i).fname,'r') '.img']);
spm_unlink([spm_str_manip(VResI(i).fname,'r') '.hdr']);
spm_unlink([spm_str_manip(VResI(i).fname,'r') '.mat']);
end
and comment it out so it looks like:
%-Delete the residuals images
%==========================================================================
%for  i = 1:nSres,
%    spm_unlink([spm_str_manip(VResI(i).fname,'r') '.img']);
%    spm_unlink([spm_str_manip(VResI(i).fname,'r') '.hdr']);
%    spm_unlink([spm_str_manip(VResI(i).fname,'r') '.mat']);
%end
3) open spm_defaults.m for editing by typing

>> edit spm_defaults

4) Find the following line (line 35 in my version of SPM5):

defaults.stats.maxres   = 64;

and change to:

defaults.stats.maxres   = Inf;

5) Save both files and run your analysis.

Make sure that once you no longer need to see the residual images, you revert the code to its original state, otherwise you’ll run out of hard disk space very very quickly!
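
Once the ResI images survive estimation, pulling the residual timecourse out of an ROI takes only a few lines of Marsbar. A rough sketch (the ROI filename is a placeholder, so check the calls against your Marsbar version):

roi  = maroi('my_seed_roi.mat');                       % load a Marsbar ROI
resI = spm_select('FPList', pwd, '^ResI_.*\.img$');    % the retained residual images
y    = get_marsy(roi, resI, 'mean');                   % extract, averaging over ROI voxels
ts   = summary_data(y);                                % residual timecourse, one value per scan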

I recently came across a web-page I should have committed to memory years ago, when I was first starting to get to grips with SPM analysis:

Matthew Brett’s Introduction to SPM Statistics

It’s a fantastically straightforward guide to how SPM analysis uses a regression model defined by your contrasts to establish the voxels in your scanner images that have significant activations.

It doesn’t take too much understanding on top of what you get from this web-page to appreciate that when you’re specifying that you want onsets modelled as a haemodynamic response function (hrf), the software is simply building a timecourse by stacking hrfs on top of one another according to your design-defined onsets.  It then fits the regression, now defined by parameters derived from that hrf timecourse rather than the task difficulty values from 1-5 used in the web-page’s example.
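
The basic idea is easy to sketch in a few lines of MATLAB using SPM’s own canonical HRF. This is a toy illustration rather than SPM’s actual code, and the TR, run length and onsets below are made up:

TR     = 2;                          % repetition time in seconds (assumed)
nScans = 200;                        % run length in scans (assumed)
onsets = [10 40 90 150];             % event onsets in seconds (assumed)

hrf   = spm_hrf(TR);                 % canonical HRF sampled every TR
stick = zeros(nScans, 1);
stick(round(onsets / TR) + 1) = 1;   % a 'stick' at each onset
regressor = conv(stick, hrf);        % stack an HRF on top of each stick
regressor = regressor(1:nScans);     % trim to the length of the run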

I’d say this should be required reading for all those getting to grips with SPM.

Whole brain masks are produced by SPM when estimating a model.  They’re great to look over if you want to check  the extent of participant movement (a quick heuristic is to examine whether movement has been so severe that it has noticeably chopped off bits of the brain, e.g. the cerebellum).

These masks can also be used as large, whole-brain ROIs from which to extract signal to covary out of resting connectivity analyses.  I’ll write more about conducting resting connectivity analyses using SPM, without the need for a dedicated connectivity toolbox, at a later date, but it involves extracting timecourses from the whole brain, white matter and CSF and entering these as nuisance regressors alongside movement parameters and their first derivatives.  I use Marsbar to extract the timecourses from the ROI files saved in the *roi.mat format.

Recently, when combining a few different datasets into one bank of resting connectivity data, I noticed that the whole brain mask aggregated across the large number of participants was dropping out a lot of the brain – not enough to consider excluding individual participants, but cumulatively quite deleterious for the overall mask.  I therefore used ImCalc to generate a binary-thresholded image (thresholded at 0.2) of the SPM-bundled EPI template.  As you can see below, once you remove the eyeballs, this makes for a nice whole-brain mask.

Whole-brain mask constructed from SPM EPI template
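
If you would rather reproduce that ImCalc step in a batch script than through the GUI, something like the following should do it. This is a sketch that assumes SPM8’s bundled template location; adjust the path for your installation, and note that the eyeballs will still need removing afterwards:

spm_jobman('initcfg');
matlabbatch{1}.spm.util.imcalc.input      = {fullfile(spm('Dir'), 'templates', 'EPI.nii,1')};
matlabbatch{1}.spm.util.imcalc.output     = 'wholebrain_mask';
matlabbatch{1}.spm.util.imcalc.outdir     = {pwd};
matlabbatch{1}.spm.util.imcalc.expression = 'i1 > 0.2';    % binarise the template at 0.2
spm_jobman('run', matlabbatch);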

I’ve zipped this mask and made it available in roi.mat and .nii format here.