ENH: Add support for (M/EEG) FIFF file format? #4
@JWinawer says:
> Looks like they use Magnes 3600 (4D) data -- reading and processing Magnes 3600 MEG data (as well as many other data formats) is fully supported by mne-python, so we should be good there. FWIW I think there is already pretty good format coverage in mne-python, but if any formats are missing it is pretty easy to add support for them, too.

@Eric89GXL added a 'meeg' branch to help get this started. I think the first thing we should consider is how to get sorting information.
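As an illustration of that format support, here is a minimal sketch of reading a Magnes 3600 (4D/BTi) recording with mne-python; the file names used are the conventional 4D ones and are placeholders, not files from this project:

```python
import mne

# Sketch: open a Magnes 3600 (4D/BTi) recording with mne-python.
# "c,rfDC", "config", and "hs_file" are the conventional 4D file names
# and are placeholders here, not paths from this project.
raw = mne.io.read_raw_bti(
    "c,rfDC",                    # 4D "pdf" data file
    config_fname="config",       # acquisition configuration
    head_shape_fname="hs_file",  # digitized head shape
    preload=False,               # header/info only, no data in memory
)
print(raw.info["nchan"], raw.info["meas_date"])
```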
(Copied this issue from the old repo, where I errantly posted it.)
I haven't used NIMS yet, but my group is considering adopting it for (at least) MRI data organization. In the coming months we plan to set up NIMS and see whether it suits our data management needs.
In our group we also do M/EEG studies. It seems like being able to use NIMS as a front-end for MEG data organization would be really cool. I talked with @jyeatman about the possibility today, and he said that you folks might be receptive to this idea. He told me I'd need a way to scrape the header information from MEG files. Assuming mne-python (I am a dev of the package) is installed, this is a two-liner, so I don't think it would be too difficult:
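A minimal sketch of what such a two-liner could look like, assuming a FIFF recording and mne-python's `read_raw_fif` (the file path is a placeholder):

```python
import mne

# Read only the measurement info (subject info, measurement date, channels, ...)
# without loading the data into memory; the path is a placeholder.
info_dict = mne.io.read_raw_fif("recording_raw.fif", preload=False).info
```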
`info_dict` then has the necessary information to identify the scan, including the subject name, scan date, etc. Assuming this worked, it would also make a tight integration of mne-python pipelines with NIMS possible (cc @agramfort). Let me know if this is something you would potentially be interested in having as part of NIMS!