Bug 2734 - initial CIFTI io tests fail
Status:           REOPENED
Reported:         2014-10-15 11:40:00 +0200
Modified:         2015-06-11 23:19:00 +0200
Product:          FieldTrip
Component:        fileio
Version:          unspecified
Hardware:         PC
Operating System: Mac OS
Importance:       P5 normal
Assigned to:      Robert Oostenveld
See also:         http://bugzilla.fcdonders.nl/show_bug.cgi?id=2096
Andre Marquand - 2014-10-15 11:40:49 +0200
Created attachment 665: debug.xml file

On one of the smaller machines (mentat206) I ran out of memory:

    x = ft_read_cifti('tstat1.dtseries.nii')
    Error using nan
    Out of memory. Type HELP MEMORY for your options.
    Error in ft_read_cifti (line 681)
      dat = nan(Ngreynodes,Ntime);

And on one of the other machines (dccn-c011) I encountered the following error:

    x = ft_read_cifti('tstat1.dtseries.nii')
    Subscripted assignment dimension mismatch.
    Error in ft_read_cifti (line 682)
      dat(greynodeIndex(dataIndex),:) = transpose(data);

The data I am trying to load are one of the t-statistic images from one of the tfMRI contrasts, which you can (for example) find in:

    /home/mrstats/andmar/data/hcp/data_unpacked/100307/MNINonLinear/Results/tfMRI_LANGUAGE/tfMRI_LANGUAGE_hp200_s8_level2.feat/GrayordinatesStats/cope5.feat

These are not overly large files, and I am currently manipulating many of them simultaneously in memory using my current routines (SPM gifti + nifti routines). I have also attached a debug.xml file, which was generated as part of this process. I hope that helps; of course I am happy to run additional tests to help streamline the code.
Robert Oostenveld - 2014-10-15 11:45:42 +0200
I have copied tstat1.dtseries.nii to /home/common/matlab/fieldtrip/data/test/bug2734 and will test it out.
Robert Oostenveld - 2014-10-15 12:01:15 +0200
(In reply to Robert Oostenveld from comment #1)
With 16 GB requested it gets killed; with 32 GB requested it also gets killed, but in both cases I was (just) able to see the error. I'll move to a 64 GB session for diagnosis.
Robert Oostenveld - 2014-10-15 15:04:25 +0200
I found and fixed the problem. It was due to an incorrect reshape, which caused the number of time points to become equal to the number of brainordinates. Subsequently the code tried to allocate a 96854-by-96854 matrix, which at 8 bytes per double is about 70 GB.

    mac011> svn commit
    Sending        fileio/ft_read_cifti.m
    Adding         test/test_bug2734.m
    Transmitting file data ..
    Committed revision 9902.
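For reference, a minimal MATLAB sketch of the failure mode described above (the variable names follow the error messages in comment #0, but the code is a hypothetical illustration, not the actual ft_read_cifti implementation):

```matlab
% Hypothetical illustration of the bug, not the actual ft_read_cifti code.
Ngreynodes = 96854;   % number of brainordinates in this dtseries file
Ntime      = 1;       % a single t-statistic map

% Buggy behaviour: the incorrect reshape made Ntime equal to Ngreynodes,
% so the preallocation on line 681 became nan(96854, 96854), which is
% 96854^2 * 8 bytes, roughly 70 GB:
% dat = nan(Ngreynodes, Ngreynodes);   % Out of memory

% With the dimensions untangled the allocation is tiny:
dat = nan(Ngreynodes, Ntime);          % 96854 doubles, under 1 MB
fprintf('avoided allocating ~%.0f GB\n', Ngreynodes^2 * 8 / 2^30);
```

The "Subscripted assignment dimension mismatch" on the other machine is the same bug surfacing one line later: when the allocation happened to succeed, the transposed data no longer matched the (wrongly shaped) target matrix.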
Robert Oostenveld - 2015-02-11 10:40:38 +0100
Closed several bugs that were recently resolved. Please reopen if you are not happy with the resolution.
Andre Marquand - 2015-02-11 13:34:06 +0100
I'm afraid the CIFTI routines still do not work with MRI data: the subcortical data are not recognised by Connectome Workbench after writing.
Robert Oostenveld - 2015-02-12 10:32:48 +0100
(In reply to Andre Marquand from comment #5)
Hi Andre, for MEG we don't look at subcortical structures that often, so we don't have combined data. Could you provide an example dataset/file, or an example script that allows us to reproduce the problem? That would save us quite some time.
Robert
Andre Marquand - 2015-02-16 14:07:06 +0100
(In reply to Robert Oostenveld from comment #6)
Hi Robert,
You can use this file:

    /home/mrstats/andmar/data/hcp/data_unpacked/100307/MNINonLinear/Results/tfMRI_GAMBLING/tfMRI_GAMBLING_hp200_s8_level2.feat/GrayordinatesStats/cope1.feat/cope1.dtseries.nii

If you load the image and then save it out (without modifying anything), then try to view the volumetric components in Workbench, you should be able to recreate the problem.
Thanks, Andre.
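The read/write round trip described above can be sketched as follows (a hedged sketch: the output filename and the ft_write_cifti options are assumptions, check the FieldTrip documentation for your version):

```matlab
% Hedged reproduction sketch; option names are assumptions, not verified
% against the exact FieldTrip revision discussed in this bug.
fname = 'cope1.dtseries.nii';
x = ft_read_cifti(fname);                 % read, do not modify anything
ft_write_cifti('roundtrip', x, 'parameter', 'dtseries');
% then open roundtrip.dtseries.nii in Connectome Workbench and check
% whether the volumetric (subcortical) components are recognised
```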