diff --git a/README.md b/README.md
index ef85470..87ddcc1 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
-# Brainchop [![Version](https://img.shields.io/badge/Version-2.1.0-brightgreen)]() [![JS ](https://img.shields.io/badge/Types-JavaScript-blue)]() [![MIT-License ](https://img.shields.io/badge/license-MIT-green)](https://github.com/neuroneural/brainchop/blob/master/LICENSE) [![tfjs](https://img.shields.io/badge/tfjs-Pre--trained%20Model-blue)](https://github.com/neuroneural/brainchop/tree/master/models/mnm_tfjs_me_test) [![DOI](https://joss.theoj.org/papers/10.21105/joss.05098/status.svg)](https://doi.org/10.21105/joss.05098)
+# Brainchop [![Version](https://img.shields.io/badge/Version-4.0.0-brightgreen)]() [![JS ](https://img.shields.io/badge/Types-JavaScript-blue)]() [![MIT-License ](https://img.shields.io/badge/license-MIT-green)](https://github.com/neuroneural/brainchop/blob/master/LICENSE) [![tfjs](https://img.shields.io/badge/tfjs-Pre--trained%20Model-blue)](https://github.com/neuroneural/brainchop/tree/master/models/mnm_tfjs_me_test) [![DOI](https://joss.theoj.org/papers/10.21105/joss.05098/status.svg)](https://doi.org/10.21105/joss.05098)
@@ -56,7 +56,7 @@ This basic example provides an overview of the training pipeline for the MeshNet
 ## Live Demo
-To see Brainchop in action please click [here](https://neuroneural.github.io/brainchop).
+To see Brainchop **v4** in action please click [here](https://neuroneural.github.io/brainchop).
@@ -69,7 +69,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
-**Brainchop v4.0.0 with NiiVue viewer**
+**Brainchop v4 with NiiVue viewer**

@@ -78,7 +78,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
-**Brainchop v3.0.0 with more robust models**
+**Brainchop v3 with more robust models**
@@ -88,7 +88,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 ![Interface](https://github.com/neuroneural/brainchop/blob/master/css/images/Input3DEnhancements.gif)
-**Brainchop v1.4.0 - v3.4.0 rendering MRI Nifti file in 3D**
+**Brainchop v1.4.0 - v3.4.0 rendering MRI Nifti file in 3D**
@@ -98,7 +98,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 ![Interface](https://github.com/neuroneural/brainchop/blob/master/css/images/Brainchop3D.gif)
-**Brainchop v1.3.0 - v3.4.0 rendering segmentation output in 3D**
+**Brainchop v1.3.0 - v3.4.0 rendering segmentation output in 3D**
diff --git a/v3/README.md b/v3/README.md
index ae7475a..c6cf8d3 100644
--- a/v3/README.md
+++ b/v3/README.md
@@ -10,7 +10,7 @@
 **Frontend For Neuroimaging. Open Source**
-**[Demo](https://neuroneural.github.io/brainchop)   [Updates](#Updates)   [Doc](https://github.com/neuroneural/brainchop/wiki/)   [News!](#News)   [Cite](#Citation)**
+**[Demo](https://neuroneural.github.io/brainchop/v3)   [Updates](#Updates)   [Doc](https://github.com/neuroneural/brainchop/wiki/)   [News!](#News)   [Cite](#Citation)**
@@ -19,7 +19,7 @@

- Brainchop brings automatic 3D MRI volumetric segmentation capability to neuroimaging by running a lightweight deep learning model (e.g., MeshNet) in the web-browser for inference on the user side.
+ Brainchop v3 brings automatic 3D MRI volumetric segmentation capability to neuroimaging by running a lightweight deep learning model (e.g., MeshNet) in the web-browser for inference on the user side.

@@ -31,7 +31,7 @@

-![Interface](https://github.com/neuroneural/brainchop/blob/master/css/images/brainchop_Arch.png)
+![Interface](./css/images/brainchop_Arch.png)
 **Brainchop high-level architecture**
@@ -39,7 +39,7 @@
-![Interface](https://github.com/neuroneural/brainchop/blob/master/css/images/DL_Arch.png)
+![Interface](./css/images/DL_Arch.png)
 **MeshNet deep learning architecture used for inference with Brainchop** (MeshNet paper)
@@ -56,7 +56,7 @@ This basic example provides an overview of the training pipeline for the MeshNet
 ## Live Demo
-To see Brainchop in action please click [here](https://neuroneural.github.io/brainchop).
+To see Brainchop v3.4.0 in action please click [here](https://neuroneural.github.io/brainchop/v3).
@@ -64,7 +64,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 **Brainchop v3.0.0 with more robust models**
@@ -74,7 +74,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
-![Interface](https://github.com/neuroneural/brainchop/blob/master/css/images/Input3DEnhancements.gif)
+![Interface](./css/images/Input3DEnhancements.gif)
 **Brainchop v1.4.0 rendering MRI Nifti file in 3D**
@@ -83,7 +83,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
-![Interface](https://github.com/neuroneural/brainchop/blob/master/css/images/Brainchop3D.gif)
+![Interface](./css/images/Brainchop3D.gif)
 **Brainchop v1.3.0 rendering segmentation output in 3D**
@@ -98,7 +98,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 * Brainchop [v2.2.0](https://github.com/neuroneural/brainchop/releases/tag/v2.2.0) paper is accepted in the 21st IEEE International Symposium on Biomedical Imaging ([ISBI 2024](https://biomedicalimaging.org/2024/)). Lengthy arXiv version can be found [here](https://arxiv.org/abs/2310.16162).

@@ -107,7 +107,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 * Brainchop [paper](https://doi.org/10.21105/joss.05098) is published in the Journal of Open Source Software (JOSS) on March 28, 2023.

@@ -116,7 +116,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 * Brainchop abstract is accepted for poster presentation during the 2023 [OHBM](https://www.humanbrainmapping.org/) Annual Meeting.

@@ -125,7 +125,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 * Brainchop 1-page abstract and poster is accepted in 20th IEEE International Symposium on Biomedical Imaging ([ISBI 2023](https://2023.biomedicalimaging.org/en/))

@@ -134,7 +134,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 * Google, Tensorflow community spotlight award for brainchop (Sept 2022) on [Linkedin](https://www.linkedin.com/posts/tensorflow-community_github-neuroneuralbrainchop-brainchop-activity-6978796859532181504-cfCW?utm_source=share&utm_medium=member_desktop) and [Twitter](https://twitter.com/TensorFlow/status/1572980019999264774)

@@ -143,7 +143,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 * Brainchop invited to [Pytorch](https://pytorch.org/ecosystem/ptc/2022) flag conference, New Orleans, Louisiana (Dec 2022)
@@ -153,7 +153,7 @@ To see Brainchop in action please click [here](https://neuroneural.github.io/br
 * Brainchop invited to TensorFlow.js Show & Tell episode #7 (Jul 2022).
 ## Citation
@@ -207,7 +207,7 @@ This work was funded by the NIH grant RF1MH121885. Additional support from NIH R
 **Mohamed Masoud - Sergey Plis - 2024**
diff --git a/v3/test/README.md b/v3/test/README.md
index 511a2da..5d032c2 100644
--- a/v3/test/README.md
+++ b/v3/test/README.md
@@ -1,19 +1,19 @@
-# Brainchop [![Version](https://img.shields.io/badge/Version-3.4.0-brightgreen)]() [![JS ](https://img.shields.io/badge/Types-JavaScript-blue)]() [![MIT-License ](https://img.shields.io/badge/license-MIT-green)](https://github.com/neuroneural/brainchop/blob/master/LICENSE) [![Pass ](https://img.shields.io/badge/Pass-OK-green)](https://neuroneural.github.io/brainchop/test/runner.html)
+# Brainchop [![Version](https://img.shields.io/badge/Version-3.4.0-brightgreen)]() [![JS ](https://img.shields.io/badge/Types-JavaScript-blue)]() [![MIT-License ](https://img.shields.io/badge/license-MIT-green)](https://github.com/neuroneural/brainchop/blob/master/LICENSE) [![Pass ](https://img.shields.io/badge/Pass-OK-green)](./runner.html)
 ## Unit Testing
-**Mocha** unit testing is available in the browser with this [link](https://neuroneural.github.io/brainchop/v3/test/runner.html).
+**Mocha** unit testing is available in the browser with this [link](./runner.html).

 **Mohamed Masoud - Sergey Plis - 2024**