# Hosting

This page goes over setting up a local instance of the Mol\* Volumes and Segmentations server and frontend. As this involves cloning the Mol\*VS and Mol\* repositories, a Git client must be installed.

# Requirements for local hosting

Recommended technical requirements:
 - 16 GB RAM or more
 - 8-core CPU or more
 - At least 2 GB of available storage (more may be needed, depending on the number of entries in the database and the size of the input data to be processed)
 - Latest version of a modern web browser (e.g. Chrome)

Other requirements:
 - [Node.js and npm](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm)

# Obtaining the code & setting up the environment

Clone this GitHub repository:

```
git clone https://github.com/molstar/molstar-volseg
cd molstar-volseg
```

Create a Conda environment from `environment.yaml` using [Conda](https://conda.io/projects/conda/en/latest/user-guide/install/index.html):

```
conda env create -f environment.yaml
```

or [Mamba](https://mamba.readthedocs.io/en/latest/installation.html):

```
mamba env create -f environment.yaml
```

All subsequent commands need to be run in the created Conda environment after activating it:

```
conda activate cellstar-volume-server
```

# Setting up the database

## Supported formats

- [SFF](https://www.ebi.ac.uk/emdb/documentation#seg_model) (`.hff`) for segmentation data
- CCP4 for volume data (`.map`, `.mrc`, `.ccp4` files)
- Additional formats (`.am`, `.mod`, `.seg`, `.stl`) are supported through [sfftk](https://github.com/emdb-empiar/sfftk) conversion to SFF (`.hff`)

## Adding an entry to the database

The `preprocessor/main.py` script is used for adding entries to the database with the following arguments:

 - `--db_path` - path to the database folder, defaults to `test-data/db`
 - `--single_entry` - path to the folder with the input files (volume and segmentation)
 - `--entry_id` - entry id (e.g. `emd-1832`) to be used as the database folder name for that entry
 - `--source_db` - source database name (e.g. `emdb`) to be used as the DB folder name
 - `--source_db_id` - actual source database ID of that entry (used to compute metadata)
 - `--source_db_name` - actual source database name (used to compute metadata)
 - `--force_volume_dtype` - optional data type for the volume data, used instead of the one in the volume file. Not used by default
 - `--quantize_volume_data_dtype_str` - optional data quantization (options are `u1` or `u2`); reduces precision but also requires less storage. Not used by default
 - `--temp_zarr_hierarchy_storage_path` - optional path to the directory where temporary files will be stored during the build process. Defaults to `test-data/preprocessor/temp_zarr_hierarchy_storage/${DB_PATH}`


Example:

- Create a folder `inputs/emd-1832`
- Download the [MAP](https://ftp.ebi.ac.uk/pub/databases/emdb/structures/EMD-1832/map/emd_1832.map.gz) and extract it to `inputs/emd-1832/emd_1832.map`
- Download the [Segmentation](https://www.ebi.ac.uk/em_static/emdb_sff/18/1832/emd_1832.hff.gz) and extract it to `inputs/emd-1832/emd_1832.hff`
- Run

```
python preprocessor/main.py --db_path my_db --single_entry inputs/emd-1832 --entry_id emd-1832 --source_db emdb --source_db_id emd-1832 --source_db_name emdb
```
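
The steps above can be scripted in one go. This is a sketch for Linux/Mac, assuming `wget` and `gunzip` are available, run from the repository root inside the activated Conda environment:

```shell
# Fetch and extract the example EMDB entry, then add it to the database.
mkdir -p inputs/emd-1832
wget -O inputs/emd-1832/emd_1832.map.gz https://ftp.ebi.ac.uk/pub/databases/emdb/structures/EMD-1832/map/emd_1832.map.gz
wget -O inputs/emd-1832/emd_1832.hff.gz https://www.ebi.ac.uk/em_static/emdb_sff/18/1832/emd_1832.hff.gz
gunzip inputs/emd-1832/emd_1832.map.gz inputs/emd-1832/emd_1832.hff.gz
python preprocessor/main.py --db_path my_db --single_entry inputs/emd-1832 \
    --entry_id emd-1832 --source_db emdb --source_db_id emd-1832 --source_db_name emdb
```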

## Adding segmentation in non-SFF format

There are two options:
- Use the application-specific segmentation file format directly (we support internal conversion of segmentation data in `.am`, `.mod`, `.seg`, and `.stl` formats to SFF via the `sfftk` package)
- Run the [sfftk conversion](https://sfftk.readthedocs.io/en/latest/converting.html) yourself, and then use the resulting SFF (`.hff`) file to add an entry to the database as described above

Example:

- Create a folder `inputs/emd-9094`
- Download [`emd_9094.map`](https://ftp.ebi.ac.uk/pub/databases/emdb/structures/EMD-9094/map/emd_9094.map.gz) and extract it to `inputs/emd-9094/emd_9094.map`
- Use [Segger](https://www.cgl.ucsf.edu/chimera/docs/ContributedSoftware/segger/segment.html) to compute a segmentation and store it in `inputs/emd-9094/emd_9094.seg`
- Run

```
python preprocessor/main.py --db_path my_db --single_entry inputs/emd-9094 --entry_id emd-9094 --source_db emdb --source_db_id emd-9094 --source_db_name emdb
```
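
If you instead convert to SFF yourself (the second option above), the standalone conversion might look like the sketch below. The exact `sff convert` flags are an assumption here; consult the sfftk converting guide linked above for your installed version:

```shell
# Hypothetical standalone conversion of a Segger file to SFF (.hff);
# requires sfftk to be installed and the .seg file to exist.
mkdir -p inputs/emd-9094
sff convert inputs/emd-9094/emd_9094.seg -o inputs/emd-9094/emd_9094.hff
```

The resulting `.hff` file can then be added to the database exactly as in the SFF example earlier on this page.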

## Build a Database from a CSV list

From the root project directory (`molstar-volseg` by default) run:

```
python preprocessor/src/tools/deploy_db/build.py --csv_with_entry_ids test-data/preprocessor/db_building_parameters_custom_entries.csv
```

This will build a database with 11 EMDB entries, using the default values of all other arguments.

Note that building this example may require 16 GB+ RAM.

Supported `build.py` arguments:
 - `--csv_with_entry_ids` - CSV file with the entry ids and info to preprocess
 - `--raw_input_files_dir` - path to the raw input files to preprocess, defaults to `test-data/preprocessor/raw_input_files`
 - `--db_path` - path to the database folder, defaults to `test-data/db`
 - `--temp_zarr_hierarchy_storage_path` - path to the directory where temporary files will be stored during the build process. Defaults to `test-data/preprocessor/temp_zarr_hierarchy_storage/${DB_PATH}`


# Hosting

## Hosting the Mol* Volumes and Segmentations Server

To run the API, specify these environment variables:

- `HOST=0.0.0.0` - where to host the server
- `PORT=9000` - what port to run the app on
- `DB_PATH=my_db` - path to the database folder
- `DEV_MODE=False` - if `True`, runs the server in reload mode

and within the `cellstar-volume-server` Conda environment run:

```
cd server
python serve.py
```

Examples:

Linux/Mac:

```
cd server
DEV_MODE=True python serve.py
```

Windows (cmd):

```
cd server
set DEV_MODE=true
python serve.py
```

## Setting up Mol* Viewer

- To view the data, a [Volumes and Segmentations extension](https://github.com/molstar/molstar/tree/master/src/extensions/volumes-and-segmentations) is available as part of the [Mol* Viewer](https://github.com/molstar/molstar).
- To install the plugin, run the following commands:
```
git clone https://github.com/molstar/molstar.git
cd molstar
npm install
npm run build
```

- Prepare an HTML file:
```html
<!DOCTYPE html>
<html lang="en">
    <head>
        <meta charset="utf-8" />
        <meta name="viewport" content="width=device-width, user-scalable=no, minimum-scale=1.0, maximum-scale=1.0">
        <link rel="icon" href="./favicon.ico" type="image/x-icon">
        <title>Embedded Volseg</title>
        <style>
            #app {
                position: absolute;
                left: 100px;
                top: 100px;
                width: 800px;
                height: 600px;
            }
        </style>
        <link rel="stylesheet" type="text/css" href="molstar.css" />
    </head>
    <body>
        <div id="app"></div>
        <script type="text/javascript" src="./molstar.js"></script>
        <script type="text/javascript">
            molstar.Viewer.create('app', {
                // URL that points to the server instance
                volumesAndSegmentationsDefaultServer: 'http://localhost:9000/v2',
                layoutIsExpanded: true,
                layoutShowControls: true,
                layoutShowRemoteState: false,
                layoutShowSequence: true,
                layoutShowLog: false,
                layoutShowLeftPanel: true,

                viewportShowExpand: true,
                viewportShowSelectionMode: false,
                viewportShowAnimation: false,

                pdbProvider: 'rcsb',
                emdbProvider: 'rcsb',
            })
        </script>
    </body>
</html>
```

- Copy `molstar.js` and `molstar.css` from the `build/viewer` directory of your local copy of the Mol* repository into the directory where the prepared HTML file is located

- Host the Mol* Volumes and Segmentations server as described above

- Open the prepared HTML file in your web browser, and view entries as described in the [Tutorial](https://molstar.org/viewer-docs/volumes_and_segmentations/how-to/)
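
The copy step above can be sketched as follows. The `site/` folder name is hypothetical (it is simply wherever the prepared HTML file lives), and the paths assume the `molstar` checkout built earlier sits in the current directory; adjust both to your layout:

```shell
# Gather the built viewer bundle next to the prepared HTML page.
# `site/` is a hypothetical destination folder; adjust paths as needed.
mkdir -p site
cp molstar/build/viewer/molstar.js molstar/build/viewer/molstar.css site/ 2>/dev/null \
    || echo "Viewer bundle not found: run 'npm install && npm run build' in the molstar checkout first."
```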


## Internal script for preprocessing the database and hosting the API and landing page

We use the [build_and_deploy.py](../preprocessor/src/tools/deploy_db/build_and_deploy.py) script to preprocess the database and host the API and landing page. Note that it will not host the Mol* viewer locally. Nevertheless, this script, with some modifications, may be useful when running the solution on your own data.

To build the database and host the landing page and API, run from the `molstar-volseg` directory (default):

```
python preprocessor/src/tools/deploy_db/build_and_deploy.py --csv_with_entry_ids test-data/preprocessor/db_building_parameters_custom_entries.csv
```

`preprocessor/src/tools/deploy_db/build_and_deploy.py` arguments:

 - `--csv_with_entry_ids` - CSV file with the entry ids and info to preprocess
 - `--raw_input_files_dir` - path to the raw input files to preprocess, defaults to `test-data/preprocessor/raw_input_files`
 - `--db_path` - path to the database folder, defaults to `test-data/db`
 - `--temp_zarr_hierarchy_storage_path` - path to the directory where temporary files will be stored during the build process. Defaults to `test-data/preprocessor/temp_zarr_hierarchy_storage/${DB_PATH}`
 - `--api_port` - API port, defaults to `9000`
 - `--api_hostname` - API hostname, defaults to `localhost`
 - `--frontend_port` - frontend port, defaults to `3000`


## Troubleshooting

### `ImportError: libEGL.so.1: cannot open shared object file: No such file or directory` error on Debian

- Download libiconv: `wget https://ftp.gnu.org/pub/gnu/libiconv/libiconv-1.17.tar.gz`
  - Extract and compile it (e.g. `./configure && make && make install`), and add the resulting library path to `LD_LIBRARY_PATH`
- Add the line `deb http://deb.debian.org/debian bullseye main` to `/etc/apt/sources.list`
- Run `apt-get update`
- Run `apt-get install libegl-dev`