At the International Conference on Memory last month, I presented some new work from my lab: a 21-participant fMRI memory experiment. We imaged people’s brains as they underwent a procedure that generates sensations likened to déjà vu (based on Josie Urquhart’s procedure, published in 2014, that you can find here). What makes this work particularly exciting is that, to our knowledge, this is the first time people undergoing an experimental analogue of déjà vu have been imaged. It led to some pretty neat results.

Memory conflict-related brain regions that track déjà vu reports

The findings were picked up by New Scientist and are summarised in the piece below:
https://www.newscientist.com/article/2101089-mystery-of-deja-vu-explained-its-how-we-check-our-memories/

Embedded within that article is the following video, which distils the essence of what we’re excited about – brain regions associated with memory conflict, rather than false memory, appear to be driving the déjà vu experience. This is consistent with our idea of déjà vu as the conscious awareness of a discrepancy in memory signals being corrected. This in turn sheds some light on why déjà vu occurrence appears to decline with age despite the fact that memory errors tend to increase with age. If it’s not an error, but the prevention of an error, this makes a lot more sense.

[17/08/2016 UPDATE]:

A few other news organisations have since reported the story:

BBC World Service Newshour (interview, audio below)

Absolute Radio (interview, audio below: 42.55 – 48.02)

IFLScience (text, rejigged NS article)

New York Magazine (text, neat explanation of the paradigm)

[18/08/2016 UPDATE]:

Digital Trends (text, one of the only online news organisations to speak to me in person)

Daily Mail (text, unnecessarily scary headline, lots of lovely comments :/ )

Gizmodo (text, more hyperbole)

[19/08/2016 UPDATE]:

Medical Daily (text)

News.com.au (text, emphasises importance of peer review to come)

I often bang on about how useful twitter is for crowd-sourcing a research community. Today I was reminded of quite how brilliant the people on twitter can be at helping to overcome an ‘I don’t know where to start’-type information problem.

I’m currently helping to design an fMRI study which could benefit considerably from the application of multivoxel pattern analysis (MVPA). Having no practical experience with MVPA means I’m trying to figure out what I need to do to make the MVPA bit of the study a success. After a few hours of searching, I have come across and read a number of broad theoretical methods papers, but nothing that gives me the confidence that anything I come up with will be viable. Of course, there’s no right way of designing a study, but there are a tonne of wrong ways, and I definitely want to avoid those.

So, I turned to twitter:

Relays and Retweets from @hugospiers, @zarinahagnew and @neuroconscience led to the following tweets coming my way (stripped of @s for ease of reading… kind of).

Sure, I could have come up with as many articles to read by typing “MVPA” into Google Scholar (as I have done in the past), but the best thing about my twitter-sourced reading list is that I’m confident it’s pitched at the right level.

I’m humbled by how generous people are with their time, and glad so many friendly academics are on twitter. I hope collegiality and friendliness like this encourages many more to join our ranks.

Below are some quick-and-dirty brain outline images I’m using in a talk I’m giving in a couple of weeks. I like the calligraphic quality that the axial and sagittal slices have. The coronal image is a little more colouring-book in its outline.

axial outline
sagittal outline
coronal outline

They’re very easily generated from MRIcron screengrabs, processed in GIMP with the following straightforward series of steps:

1) Edge-detect
2) Invert Colours
3) Gaussian Blur
4) Brightness-Contrast

Repeating steps 3 and 4 a couple of times will get the consistency of line seen in the coronal image.
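
If you’d rather script the same effect than click through GIMP, something along the lines of the MATLAB sketch below (Image Processing Toolbox) should get you close. The filename and parameter values are placeholders rather than the exact settings I used, so expect to tweak them.

% Rough MATLAB approximation of the GIMP pipeline above.
% Filename and parameter values are illustrative, not the exact settings used.
img = imread('axial_screengrab.png');              % MRIcron screengrab
if ndims(img) == 3, img = rgb2gray(img); end       % collapse RGB to greyscale if needed
img = im2double(img);
edges   = edge(img, 'Sobel');                      % 1) edge-detect (logical edge map)
outline = 1 - double(edges);                       % 2) invert: dark lines on a white background
g = fspecial('gaussian', [7 7], 1.5);              % small Gaussian kernel
outline = imfilter(outline, g, 'replicate');       % 3) Gaussian blur to soften the line
outline = imadjust(outline, [0.4 0.9], []);        % 4) brightness/contrast stretch
% repeat 3) and 4) for the heavier line weight of the coronal image
outline = imadjust(imfilter(outline, g, 'replicate'), [0.4 0.9], []);
imwrite(outline, 'axial_outline.png');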

The past few days have seen my fMRI analysis server taken over by a Linux virtual machine. I have installed FSL, and have been using MELODIC to plough my way through ICA analyses of fcMRI data, a first for me.

One of the annoyances I have had to deal with as part of this project has been the difference in input data required by SPM, which my preprocessing stream targets, and FSL, which it does not. Specifically, this difference has necessitated the conversion of data runs from 3D NIFTI files to a single 4D NIFTI file. FSL has a utility for this (fslmerge), but being the Linux novice that I am, I have struggled to script the merging within the virtual machine.

Thankfully, SPM has a semi-hidden utility for this conversion.

SPM's 3d to 4d NIFTI conversion tool

The GUI is located within the Batch Editor’s SPM>Util menu, and by default saves the specified 3D NIFTI images to a single 4D NIFTI image within the same directory. Unlike fslmerge, it doesn’t gzip the output image, but it’s scriptable using the ‘View>Show .m Code’ menu option, and it’s good enough for me.
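
For reference, a scripted version of the same conversion ends up being only a couple of lines. The sketch below is a guess at the minimal form, with placeholder filenames; I believe the batch module wraps spm_file_merge, but check it against the code that ‘View>Show .m Code’ produces for your SPM version.

% Minimal sketch of scripting the 3D-to-4D conversion (SPM on the MATLAB path).
% The filename filter and output name are placeholders.
f = spm_select('FPList', pwd, '^fRun1_.*\.nii$');  % the 3D volumes for one run
spm_file_merge(f, 'Run1_4D.nii');                  % writes a single 4D NIFTI file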

 

Since setting up lab in St Andrews I’ve consistently run into a DICOM Import Error that causes the process to terminate about half-way through. I finally fixed the problem today after a quick search on the SPM mailing list.

The error I was receiving was as follows:

Running ‘DICOM Import’
Changing directory to: D:Akira Cue Framing 2011PP03
Failed ‘DICOM Import’
Error using ==> horzcat
CAT arguments dimensions are not consistent.
In file “C:\spm8\spm_dicom_convert.m” (v4213), function “spm_dicom_convert” at line 61.
In file “C:\spm8\config\spm_run_dicom.m” (v2094), function “spm_run_dicom” at line 32.

The following modules did not run:
Failed: DICOM Import

??? Error using ==> cfg_util at 835
Job execution failed. The full log of this run can be found in MATLAB command window, starting with the lines (look for the line
showing the exact #job as displayed in this error message)
——————
Running job #[X]
——————

Error in ==> spm_jobman at 208

??? Error while evaluating uicontrol Callback

This was a little mysterious, as the appropriate number of nifti files appeared to be left after the process terminated unexpectedly.

The following link suggested an SPM code tweak that might fix it:
https://www.jiscmail.ac.uk/cgi-bin/webadmin?A2=ind1106&L=SPM&P=R49499&1=SPM&9=A&J=on&d=No+Match%3BMatch%3BMatches&z=4

The proposed fix from John Ashburner simply requires changing line 61 of spm_dicom_convert.m from:

out.files = [fmos fstd fspe];

to:

out.files = [fmos(:); fstd(:); fspe(:)];

Works like a charm!

 

Out with the old, in with the new: Novelty judgements as a translational tool to assess healthy ageing

Funding Notes: Applicants will require a minimum of an upper second class Hons degree; the studentship is open to UK and European Community students. Fees and stipend are covered for UK students. EU students can get fees-only cover if they have not studied in the UK, and can get full stipend and fees if they have studied here for 3 or more years.

Application Deadline: 31st January, 2012

Supervisors: Drs James Ainge and Akira O’Connor, University of St Andrews, with Dr Rosamund Langston, University of Dundee

Project Description: How we experience memory shapes how we experience the world. Just as learning to trust our memories in childhood empowers us to explore our surroundings, learning that our memories have become untrustworthy leads to reduced independence and diminished quality of life. The deleterious effects of memory decline are most often associated with ageing in older adulthood. Although memory decline is commonly conceptualised as a reduced ability to detect ‘oldness’, we will build on recent advances across multiple fields to explore age-related memory decline as a deficit in ‘novelty’ detection (refs. 1-3). The proposed project will combine state of the art in vivo behavioural neuroscience techniques with neuropsychology and functional neuroimaging to explore the ability to detect novel stimuli across the lifespan of both rodents and humans. This multidisciplinary approach will examine how the behavioural consequences of ageing (e.g. reduced environmental interaction) are driven by age-related changes to the structure and function of memory systems across species. This combined approach will provide excellent training for the student in a variety of techniques, particularly strategically important in vivo skills, to enable a systems level understanding of memory mechanisms and how they degrade with normal ageing.

References:
Burke SN, Wallace JL, Nematollah S, Uprety AR & Barnes CA (2010). Pattern separation deficits may contribute to age-associated recognition impairments. Behavioral Neuroscience, 124, 559-73.
O’Connor AR, Lever C & Moulin CJA (2010). Novel insights into false recollection: A model of déjà vécu. Cognitive Neuropsychiatry, 15, 118-144.
O’Connor AR, Guhl EN, Cox JC, Dobbins IG (2011). Some memories are odder than others: Judgements of episodic oddity violate known decision rules. Journal of Memory and Language, 64, 299-315.

Enquiries by e-mail to James Ainge, Akira O’Connor or Rosamund Langston.

Whilst Windows easily copies lots of data, it struggles when you ask it to copy lots and lots and lots of data.  Teracopy is a neat file copying utility that provides peace of mind as you transition from copying gigabytes of data to terabytes of data.

In order to get my fMRI data from St. Louis to St. Andrews, I have embarked upon the somewhat arduous task of copying everything to a portable hard-drive.   After a few attempts that ended in the failure to copy a file or two, seemingly at random, I lost faith in using the standard drag-and-drop copy in Windows, and searched for alternatives.  The command line option seemed fastest, but I didn’t want to bring the server down for everyone else for a few hours whilst I did my copying.  Then I found Teracopy.

Teracopy in action (Image via Wikipedia)

Teracopy (freeware) is a straightforward utility that improves upon the Windows interface in a number of ways.  Copying is (apparently) faster and it certainly seems more reliable than the standard Windows approach.  One very nice feature is that it allows you to pause and resume your copying for when you need to free up system resources temporarily.

download Teracopy

So far I have copied close to a terabyte of data onto my portable hard-drive with no problems.  Now all that remains is to check it all with another utility (Windiff) to make sure all my files really did get copied successfully, and to actually transport my hard-drive without banging or dropping it.

Typical fMRI brain scans take a 3D image of the head every few seconds.  These images are composed of lots of 2D ‘slices’ (usually axially oriented) stacked on top of one another.  This is where the problem of slice acquisition time rears its head – the problem being that these slices are not all taken at the same time; in fact, their collection tends to be distributed uniformly over the duration it takes to gather a whole 3D image.  Therefore, if you are collecting a 3D image comprising 36 slices every 2 seconds, you will have a different slice collected every 1/18th of a second.

2D slices (left; presented as a mosaic), acquired at slightly different times within a 2s TR, that make up a typical 3D image in fMRI (right; shown with a cutout)

If you’re worried about the effect of this fuzziness in temporal resolution on your data (and there are those who aren’t), then it can be corrected for in the preprocessing stages of analysis.  Of course, you do need to know the order in which your slices were collected to correct for the ordering differences.

Finding out the order of slice collection is not as easy as it should be.  On the Siemens Trio scanner that I use, it’s straightforward if you have an ‘ascending’ (bottom to top, in order: 1, 2, 3, etc.) or a ‘descending’ (top to bottom, in order: 36, 35, 34, etc.) order of slice collection.  However, if you’re using the ‘interleaved’ order (odd slices collected first, followed by even slices), it’s not immediately clear whether you’re doing that in an ascending (1, 3, 5… 2, 4, 6… etc.) or descending (35, 33, 31… 36, 34, 32… etc.) interleaved order.
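
To make the four orders concrete, here is a quick MATLAB sketch of the slice-order vectors for the 36-slice, 2 second TR example above, along with the acquisition time implied for each slice. Treat it as an illustration of the conventions rather than a statement about any particular scanner.

% Slice-order vectors for a 36-slice volume collected every 2 seconds.
% Check your own sequence documentation before trusting any of these.
nslices = 36; TR = 2;
ascending        = 1:nslices;                      % 1, 2, 3, ...
descending       = nslices:-1:1;                   % 36, 35, 34, ...
interleaved_asc  = [1:2:nslices, 2:2:nslices];     % 1, 3, 5, ... 2, 4, 6, ...
interleaved_desc = [nslices-1:-2:1, nslices:-2:1]; % 35, 33, 31, ... 36, 34, 32, ...
% Acquisition time of each slice, in seconds after the start of the volume:
% the kth slice in the order vector is collected (k-1)*TR/nslices seconds in.
[~, position] = sort(interleaved_asc);             % where each slice sits in the order
slice_times   = (position - 1) * TR / nslices;     % e.g. slice 2 is collected 1 s in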

I found out that I was collecting my slices in an interleaved, ascending order by asking the MR technician at the facility.  But, if there was no technician to hand, or if I wanted to verify this order myself, I would be very tempted to try out a method I found out about on the SPM list today:

Head-turning research (links to poster at a readable resolution on the University of Ghent web-site)

The procedure, devised by Descamps and colleagues, simply involves getting an fMRI participant to turn their head from looking straight up, to looking to one side during a very short scan.  The turn should be caught in its various stages of completion by the various slices that comprise one 3D image, allowing the curious researcher to figure out the slice acquisition order crudely, but effectively.

I enjoyed how connected to the physical reality of our own bodies this procedure is.  It reminded me that these tools we are using to make inferences about cognition are tied to our bodies in a very tangible way.  That is something I often forget when pushing vast arrays of brain-signal values around in matrices, so it’s nice to be reminded of it now and again – I’d certainly rather be reminded like this than by having to discard a participant’s data because they have moved so much during a scan as to make their data useless!

Below (indented) is a straightforward MATLAB/SPM/Marsbar script for generating separate Marsbar ROI and Nifti ROI spheres (user-defined radius) from a set of user-defined MNI coordinates stored in a text file (‘spherelist.txt’ saved in the same directory as the script).  ROI files are saved to mat and img directories that the script creates.
I use this script to generate *.mat files as seeds for resting connectivity analyses.  Once a *.mat ROI is generated, Marsbar can be used to extract raw timecourses from this region to feed into connectivity analysis as the regressor of interest.  Because MRIcron doesn’t read *.mat Marsbar ROI files, I render the equivalent *.img seed regions on a canonical brain when I need to present them graphically.

% This script loads MNI coordinates specified in a user-created file,
% spherelist.txt, and generates .mat and .img ROI files for use with
% Marsbar, MRIcron etc.  spherelist.txt should list the centres of
% desired spheres, one-per-row, with coordinates in the format:
% X1 Y1 Z1
% X2 Y2 Z2 etc
% .mat sphere ROIs will be saved in the script-created mat directory.
% .img sphere ROIs will be saved in the script-created img directory.
% SPM Toolbox Marsbar should be installed and started before running script.

% specify radius of spheres to build in mm
radiusmm = 4;

load('spherelist.txt')
% Specify output folders for the two sets of ROIs (.img format and .mat format)
roi_dir_img = 'img';
roi_dir_mat = 'mat';
% Make an img and a mat directory to save resulting ROIs
mkdir('img');
mkdir('mat');
% Go through each set of coordinates from the loaded file
spherelistrows = length(spherelist(:,1));
for spherenumbers = 1:spherelistrows
    % maximum is specified as the centre of the sphere in mm in MNI space
    maximum = spherelist(spherenumbers,1:3);
    sphere_centre = maximum;
    sphere_radius = radiusmm;
    sphere_roi = maroi_sphere(struct('centre', sphere_centre, ...
        'radius', sphere_radius));

    % Define sphere name using coordinates
    coordsx = num2str(maximum(1));
    coordsy = num2str(maximum(2));
    coordsz = num2str(maximum(3));
    spherelabel = sprintf('%s_%s_%s', coordsx, coordsy, coordsz);
    sphere_roi = label(sphere_roi, spherelabel);

    % Save ROI as MarsBaR ROI file
    saveroi(sphere_roi, fullfile(roi_dir_mat, sprintf('%dmmsphere_%s_roi.mat', ...
        radiusmm, spherelabel)));
    % Save as image
    save_as_image(sphere_roi, fullfile(roi_dir_img, sprintf('%dmmsphere_%s_roi.img', ...
        radiusmm, spherelabel)));
end

UPDATE: WordPress messed with the characters in the above script, so here is a link to the script file and an example spherelist.txt file.
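
As a rough usage example, pulling a seed timecourse out of one of the resulting *.mat ROIs goes something like the sketch below. The ROI and scan filenames are made up, and the maroi/get_marsy/summary_data calls are, as far as I remember, the standard Marsbar batch interface, so check them against your Marsbar version.

% Hypothetical sketch: extract a mean timecourse from one of the spheres above
% to use as a seed regressor in a connectivity analysis. Filenames are made up.
roi   = maroi(fullfile('mat', '4mmsphere_30_-60_24_roi.mat')); % a saved sphere ROI
scans = spm_select('FPList', pwd, '^swraf.*\.nii$');           % preprocessed functionals
mY    = get_marsy(roi, scans, 'mean');    % summarise voxels in the ROI by their mean
seed_timecourse = summary_data(mY);       % one row per scan, one column per ROI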

Only 80% of scheduled participants produce datasets where data from all runs is usable.  That’s the conclusion I have drawn from my limited experience of scanning participants for research here at Washington University.

My running totals:
Study 1: Usable data from 19 participants of 24 booked – 79%
Study 2: Usable data from 14 participants of 17 booked – 82%
Study 3 (so far): Usable data from 6 participants of 8 booked – 75%

That’s 39 from 49, approx. 80% overall.

Reasons for unusable data include script and scanner problems, participants performing at or below chance, participants falling asleep, participants needing to end the experiment early, and participants failing to show up at all.  The participant no-show scenario isn’t really too much of a problem if you are billed only for the time used on the scanner (which is what happens here), though it is rearing its ugly head for me as I am coming to the end of my time here at Washington University – every absent participant reduces my sample size by one.

All of which means, the 20 slots I was counting on for Study 3 should yield a sample of 16 – I reckon that’s on the low end of fine, but still fine.  We’ll see.