Hi! I'm thinking about using dandiarchive.org to deposit data generated with multiplexed multi-photon calcium imaging methods. My methods produce about 700 GB of raw data per hour of recording, and I would probably want to deposit on the order of 100 hours of recordings per year (roughly 70 TB/year, though that's really just a wild guess). I would use the NWB 2.x format. Is dandiarchive suitable for this, and does it allow / encourage data deposition at this scale? I couldn't find any relevant quota specs anywhere. I'd appreciate any hints. Thanks!
@nobias - the scale would be fine. Here are the policies: https://www.dandiarchive.org/handbook/about/policies/ (and you have done the right thing in reaching out). When you say 700 GB/hour, is that with or without compression?
Thank you for your quick replies, that's great to hear. The 700 GB/hour figure is without compression, in the form of naive int16 TIFF files (as written by our go-to microscope control software, Vidrio ScanImage). I'm sure that lossless compression would pay off a good bit, although our data is much less redundant than the typical high-resolution two-photon data produced by current commercial systems: we usually sample a neuron with only one or a few pixels, and the raw data is rather noisy. So, let me discuss this with my head of lab; it would be great if we can make this happen! Thank you for your offer to help, which is of course greatly appreciated.
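In case it's useful, here is a minimal, hypothetical sketch of how lossless compression could be applied when writing this kind of imaging data to NWB 2.x with PyNWB, by wrapping the dataset in H5DataIO with HDF5's built-in gzip filter. All names, rates, and metadata below are placeholders rather than the actual acquisition parameters discussed above, and the zero-filled array stands in for frames that would really be read from the ScanImage TIFF stacks.

```python
from datetime import datetime, timezone

import numpy as np
from pynwb import NWBFile, NWBHDF5IO, H5DataIO
from pynwb.ophys import OpticalChannel, TwoPhotonSeries

# Placeholder session metadata -- not the actual experiment described above
nwbfile = NWBFile(
    session_description="multiplexed multi-photon calcium imaging (example)",
    identifier="example-session-001",
    session_start_time=datetime.now(timezone.utc),
)

device = nwbfile.create_device(name="Microscope")
optical_channel = OpticalChannel(
    name="OpticalChannel",
    description="example emission channel",
    emission_lambda=520.0,  # nm, placeholder
)
imaging_plane = nwbfile.create_imaging_plane(
    name="ImagingPlane",
    optical_channel=optical_channel,
    description="example imaging plane",
    device=device,
    excitation_lambda=920.0,  # nm, placeholder
    imaging_rate=30.0,        # Hz, placeholder
    indicator="GCaMP",
    location="cortex",
)

# Stand-in for int16 frames that would come from the ScanImage TIFFs
frames = np.zeros((100, 128, 128), dtype=np.int16)

two_photon_series = TwoPhotonSeries(
    name="TwoPhotonSeries",
    # H5DataIO applies lossless gzip compression and chunking on write
    data=H5DataIO(frames, compression="gzip", chunks=True),
    imaging_plane=imaging_plane,
    starting_time=0.0,
    rate=30.0,
    unit="n.a.",
)
nwbfile.add_acquisition(two_photon_series)

with NWBHDF5IO("compressed_imaging_example.nwb", mode="w") as io:
    io.write(nwbfile)
```

Gzip is just the portable HDF5 default; how much it actually saves on noisy, few-pixels-per-neuron int16 data would need to be measured on real recordings.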