Bug 2276 - implement/wrap around neuroconn import
Status: CLOSED FIXED
Reported: 2013-09-09 09:20:00 +0200
Modified: 2014-03-12 12:20:26 +0100
Product: FieldTrip
Component: fileio
Version: unspecified
Hardware: PC
Operating System: Windows
Importance: P3 normal
Assigned to: Jörn M. Horschig
URL: http://www.neuroconn.de
Tags:
Depends on:
Blocks:
See also:
Jörn M. Horschig - 2013-09-09 09:20:48 +0200
During the TMS-EEG workshop, I was approached by an engineer from neuroconn (Dr. Falk Schlegelmilch). He asked what is necessary to make FieldTrip able to import neuroconn data. I told him that it might be a good idea to send us the (already available) Matlab low-level functions for reading in those files, so that we can make the ft_read_XXX functions wrap around them. He seemed willing to help if necessary, but imho it would be best if we take care of the high-level functions while relying on the company's low-level functions. I have not received the import functions by mail yet, but I'll contact him today.
Robert Oostenveld - 2013-09-09 11:06:41 +0200
you might want to add him to this bugzilla thread.
Falk Schlegelmilch - 2013-10-15 14:43:28 +0200
MATLAB files for low-level access to NEURO PRAX data (*.EEG) are available:
- np_readfileinfo.m - reads general information from the EEG header (filename, electrode setup, patient's name, sampling frequency, channel list, and so on)
- np_readdata.m - reads a block of data from the *.EEG file
- np_readmarker.m - reads the sample indices of all markers (set by function key during the recording) and external events (digital trigger inputs at the amplifier) in the *.EEG file

neuroConn will send data examples to Jörn Horschig. We have to discuss the integration into FieldTrip.
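For readers unfamiliar with the low-level interface, a rough sketch of how these three functions might be chained is given below. The argument lists, the filename and the output variables are assumptions based on the descriptions above, not the verified interfaces of the np_* functions.

  % Hypothetical chaining of the neuroConn low-level readers; argument lists
  % and outputs are assumptions based on the descriptions in this comment.
  filename = 'recording.EEG';          % hypothetical NEURO PRAX file

  info   = np_readfileinfo(filename);  % general header info: sampling rate, channel list, ...
  marker = np_readmarker(filename);    % sample indices of markers and external trigger events
  dat    = np_readdata(filename);      % one block of raw data from the *.EEG file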
Jörn M. Horschig - 2013-10-15 16:30:38 +0200
Thanks, I received the m-files as well as the datafile.

In FieldTrip, we use the same high-level functions for reading in header information, data and events, independent of the data format. These functions are called ft_read_header, ft_read_data and ft_read_event, respectively. The body of each function is basically a switch-statement on the dataformat, which then wraps around the dataformat-specific reading functions. Most likely, we will do the same with your dataformat. I will discuss with Robert tomorrow how to best implement the wrapping and what to take care of, as he is way more experienced with importing data into FieldTrip (and the head of the project). Most likely though, I will take care of the wrapping and the documentation and contact you if necessary (or when I am done).

Some background information: ft_read_data reads in a continuous chunk of data, i.e. one trial at a time. Channel selection is either made by the low-level function or, if that is not possible, all channels are read in and ft_read_data dismisses the undesired channels afterwards. For more information, you can have a look at our wiki page: http://fieldtrip.fcdonders.nl/reference/ft_read_data

After reading in all trials, we put them together and create a raw data structure: http://fieldtrip.fcdonders.nl/reference/ft_datatype_raw The content of those fields is what we need to extract using your functions. Additionally, we need to be able to read the header and the events. A header contains these fields: http://fieldtrip.fcdonders.nl/reference/ft_read_header and an event structure these: http://fieldtrip.fcdonders.nl/reference/ft_read_event

On our wiki we have a page on how to import data of various formats: http://fieldtrip.fcdonders.nl/reading_data That page contains information about those dataformats which need special care or additional information (mostly peculiarities of the datafiles or the sensor arrangement, if the format is associated with a particular one). If we notice any such things during the implementation of your format, or if you happen to know things that differ between your format and other formats, we can create a new page and add all that information there. In any case, we should make sure to add the neuroconn format to http://fieldtrip.fcdonders.nl/dataformat once it all runs smoothly.
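As an illustration of the wrapping described above, a minimal sketch of what the format-specific branch inside ft_read_header could look like is given below. The hdr fields are the standard FieldTrip header fields; the format identifier 'neuroprax_eeg' and the info.* field names returned by np_readfileinfo are assumptions for this sketch, not confirmed names.

  % Minimal sketch of a dataformat-specific branch inside ft_read_header;
  % the case label and the info.* field names are assumptions.
  switch headerformat
    % ... other formats ...
    case 'neuroprax_eeg'                  % assumed identifier for the neuroConn format
      info            = np_readfileinfo(filename);
      hdr             = [];
      hdr.Fs          = info.fa;          % sampling frequency (field name assumed)
      hdr.nChans      = info.K;           % number of channels (field name assumed)
      hdr.nSamples    = info.N;           % number of samples per trial (field name assumed)
      hdr.nSamplesPre = 0;                % continuous recording, no pre-trigger samples
      hdr.nTrials     = 1;                % continuous data is treated as one long trial
      hdr.label       = info.channels(:); % channel labels (field name assumed)
      hdr.orig        = info;             % keep the original header for ft_read_data
  end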
Jörn M. Horschig - 2013-10-16 13:54:45 +0200
Hi Falk, I forgot that Robert is in Stockholm the whole week and I am in the US from next week on for a month, so the integration will still take a while. Anyway, I have all the necessary files now. I will discuss with Robert when I am back how exactly to implement the import :) Best, Jörn
Falk Schlegelmilch - 2013-10-16 16:13:15 +0200
Hi Jörn, thank you for your reply. I am also out of office for at least two weeks. Therefore we don't need to hurry. Best wishes, Falk
Jörn M. Horschig - 2013-12-11 16:54:42 +0100
Hi Falk, I just started working on this and then realized that all the files start with np_. Is this the same format as the neuroprax format? It looks identical. If so, then the support should already be there, see http://code.google.com/p/fieldtrip/source/browse/trunk/fileio/private/np_readdata.m http://code.google.com/p/fieldtrip/source/browse/trunk/fileio/private/np_readmarker.m http://code.google.com/p/fieldtrip/source/browse/trunk/fileio/private/np_readfileinfo.m Best, Jörn
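If the files indeed use the already supported neuroprax format, reading them should work through the standard high-level calls. The sketch below assumes a hypothetical filename 'subject01.EEG' and that FieldTrip autodetects the format from the file.

  % Hypothetical check whether the existing neuroprax support handles the
  % example file; 'subject01.EEG' is a placeholder filename.
  hdr   = ft_read_header('subject01.EEG');   % standard FieldTrip header structure
  event = ft_read_event('subject01.EEG');    % markers and triggers as a FieldTrip event array

  cfg         = [];
  cfg.dataset = 'subject01.EEG';
  data        = ft_preprocessing(cfg);       % yields a raw data structure (ft_datatype_raw)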
Robert Oostenveld - 2014-01-14 10:25:06 +0100
Hi Thomas, I hope this email reaches you, as I am not sure which of your addresses in bugzilla is the most recent. I recall you having worked with neuroConn in the past for http://sourceforge.net/p/console-kn. Are you able to shed some light on Jörn's questions in this thread? thanks, Robert
Thomas Hartmann - 2014-01-14 10:29:37 +0100
hi robert, hello everybody, yes, i have worked with neuroconn. however, i had nothing to do with their file format, as i read the data directly from the amplifier and, after some processing, saved them in BDF format. best of luck with the endeavour! thomas
Robert Oostenveld - 2014-01-14 10:56:38 +0100
(In reply to Thomas Hartmann from comment #8) ah, too bad. Thanks for the quick reply. I'll remove you again from the CC so that you won't be bothered by this.