forest loss wip
favyen2 committed Feb 4, 2025
1 parent 8f80a9e commit b8d45af
Showing 24 changed files with 650 additions and 752 deletions.
328 changes: 0 additions & 328 deletions .github/workflows/deploy_image_on_vm.sh

This file was deleted.

28 changes: 22 additions & 6 deletions .github/workflows/forest_loss_driver_prediction.yaml
@@ -93,14 +93,30 @@ jobs:
           sudo rm -rf /opt/ghc
           sudo rm -rf /usr/local/share/boost
       - name: Authenticate into gcp
         uses: "google-github-actions/auth@v2"
         with:
           credentials_json: ${{ secrets.GCP_VM_DEPLOYER_CREDENTIALS }}
+      - name: Run integrated pipeline in Beaker job
+        run: |
+          docker compose -f docker-compose.yaml run \
+            test python -m rslp.main \
+            common \
+            beaker_launcher \
+            --project $RSLP_PROJECT \
+            --workflow $RSLP_WORKFLOW \
+            --extra_args $EXTRA_ARGS
+        env:
+          RSLP_PROJECT: forest-loss-driver
+          RSLP_WORKFLOW: integrated_pipeline
+          EXTRA_ARGS: |
+            [
+              "--pred_pipeline_config",
+              "rslp/forest_loss_driver/inference/config/forest_loss_driver_predict_pipeline_config.yaml",
+              "--make_tiles_args.dst_dir",
+              "s3://satlas-explorer-data/rslearn-public/forest_loss_driver/tiles/latest/",
+            ]
-      - name: Run Extract Dataset Job on VM and Launch Prediction Job on Beaker
+      - name:
         run: |
-          export PIPELINE_INFERENCE_CONFIG_PATH="rslp/forest_loss_driver/inference/config/forest_loss_driver_predict_pipeline_config.yaml" && \
+          export PIPELINE_INFERENCE_CONFIG_PATH= && \
           export PRED_PIPELINE_CONFIG_ARG="--pred_pipeline_config $PIPELINE_INFERENCE_CONFIG_PATH" && \
           # NOTE: The index cache dir will be copied to the VM and mounted as a volume in the docker container
           export INDEX_CACHE_DIR=${{ secrets.RSLP_PREFIX }}/datasets/forest_loss_driver/index_cache_dir && \
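The new workflow step passes the extra pipeline arguments to the launcher as a JSON list in `EXTRA_ARGS`. As a rough sketch of how such a step's pieces compose into the final command line, assuming the launcher receives the extra arguments as a JSON string (`build_command` is a hypothetical helper, not part of rslp):

```python
import json
import shlex


def build_command(project: str, workflow: str, extra_args_json: str) -> list[str]:
    # Hypothetical helper mirroring the workflow step: EXTRA_ARGS holds a
    # JSON list of strings that is forwarded to the launcher verbatim.
    extra_args = json.loads(extra_args_json)
    assert all(isinstance(a, str) for a in extra_args)
    return [
        "python", "-m", "rslp.main",
        "common", "beaker_launcher",
        "--project", project,
        "--workflow", workflow,
        "--extra_args", json.dumps(extra_args),
    ]


cmd = build_command(
    "forest-loss-driver",
    "integrated_pipeline",
    '["--pred_pipeline_config", "path/to/config.yaml"]',
)
print(shlex.join(cmd))
```

Keeping the extra arguments as a single JSON value avoids shell-quoting problems when the list is threaded through `docker compose` and the job environment.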
4 changes: 2 additions & 2 deletions data/forest_loss_driver/config.json
@@ -538,7 +538,7 @@
       ],
       "data_source": {
         "duration": "30d",
-        "index_cache_dir": "${INDEX_CACHE_DIR}/sentinel2_gcp/",
+        "index_cache_dir": "${INDEX_CACHE_DIR}sentinel2_gcp/",
         "max_time_delta": "1d",
         "name": "rslearn.data_sources.gcp_public_data.Sentinel2",
         "query_config": {
@@ -554,6 +554,6 @@
       },
       "tile_store": {
         "name": "file",
-        "root_dir": "${TILE_STORE_ROOT_DIR}/rslearn_amazon_conservation/tiles"
+        "root_dir": "${TILE_STORE_DIR}rslearn_amazon_conservation/tiles"
       }
     }
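The updated config values concatenate the placeholder and the path segment without a separator (`${INDEX_CACHE_DIR}sentinel2_gcp/`), so the environment variable itself is expected to end with a trailing slash. A minimal sketch of this style of `${VAR}` substitution (`substitute_env` and the bucket path are illustrative, not rslearn's actual resolver):

```python
import string


def substitute_env(value: str, env: dict[str, str]) -> str:
    # Sketch of ${VAR} expansion for config values; rslearn's actual
    # resolver may behave differently (assumed behavior).
    return string.Template(value).substitute(env)


# The template concatenates without a "/", so the variable must carry a
# trailing slash itself (the bucket path below is illustrative).
env = {"INDEX_CACHE_DIR": "gs://example-bucket/forest_loss_driver/index_cache_dir/"}
resolved = substitute_env("${INDEX_CACHE_DIR}sentinel2_gcp/", env)
print(resolved)
```

Putting the separator in the variable rather than the template lets the same config work whether the variable points at a local directory, a GCS prefix, or an S3 prefix.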
8 changes: 8 additions & 0 deletions rslp/common/README.md
@@ -25,3 +25,11 @@ Then you can launch the worker. To test on one machine:
 And to launch 100 workers on Beaker:
 
     python -m rslp.main common launch BEAKER_IMAGE_NAME skylight-proto-1 rslp-job-queue-YOURNAME-sub 100 --gpus 1 --shared_memory 256GiB
+
+
+Beaker Launcher
+---------------
+
+`beaker_launcher.py` launches a Beaker job that runs an rslp workflow. It offers a
+range of parameters to customize the job setup, such as which Beaker clusters to target
+and application-specific environment variables.
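A hypothetical sketch of the kind of parameter bundle such a launcher might accept; the field names and values below are illustrative assumptions, not `beaker_launcher.py`'s actual interface:

```python
from dataclasses import dataclass, field


@dataclass
class LauncherConfig:
    # Illustrative parameter bundle only; not beaker_launcher.py's real API.
    project: str
    workflow: str
    clusters: list[str] = field(default_factory=list)       # Beaker clusters to target
    env_vars: dict[str, str] = field(default_factory=dict)  # app-specific env vars


cfg = LauncherConfig(
    project="forest-loss-driver",
    workflow="integrated_pipeline",
    clusters=["ai2/example-cluster"],
    env_vars={"RSLP_PREFIX": "gs://example-bucket/rslp"},
)
print(cfg)
```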
