polish readme for demo
goldmermaid committed Nov 20, 2023
1 parent 4bf29f1 commit 92d4fca
Showing 2 changed files with 9 additions and 1 deletion.
8 changes: 7 additions & 1 deletion README.md
@@ -136,6 +136,11 @@
pip3 uninstall -y torch
pip3 install --pre torch --index-url https://download.pytorch.org/whl/nightly/cu121 # cu121 means cuda 12.1
```
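After the nightly install, a quick sanity check can confirm the CUDA build took effect. This is a sketch, not part of the repo; it guards the import so it also runs on machines where torch is absent:

```python
import importlib.util

# Check whether torch is importable before touching it, so the script
# degrades gracefully on machines where the install has not happened yet.
if importlib.util.find_spec("torch") is None:
    print("torch not installed")
else:
    import torch
    # torch.version.cuda is None on CPU-only builds, e.g. "12.1" on a cu121 wheel.
    print(torch.__version__, torch.version.cuda)
```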

Now that installation is complete, try modifying one of the examples and running the command below:

```
python -m example.retrieval_qa.retrieval_qa_huggingface_demo
```

### Frontend Dev Setup
```
...
npm run build
```
@@ -148,7 +153,8 @@
If you are on EC2, you can launch a GPU instance with the following config:
- EC2 `g5.2xlarge` (if you want to run a pretrained LLM with 7B parameters)
- Deep Learning AMI PyTorch GPU 2.0.1 (Ubuntu 20.04)
- <img src="example/image/readme_ec2_ami.jpg" alt="Alt text" width="50%" height="50%"/>
+ <img src="example/image/readme_ec2_ami.jpg" alt="Alt text" width="75%" height="75%"/>
- EBS: at least 100G

<img src="example/image/readme_ec2_storage.png" alt="Alt text" width="50%" height="50%"/>
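The EC2 configuration above could be provisioned from the AWS CLI roughly as follows. This is a hypothetical provisioning sketch, not part of the repo: the AMI id and key-pair name are placeholders (look up the current "Deep Learning AMI PyTorch GPU" id for your region before running):

```shell
# Placeholders: <deep-learning-ami-id> and <your-key-pair> must be replaced
# with real values for your account and region.
aws ec2 run-instances \
  --image-id <deep-learning-ami-id> \
  --instance-type g5.2xlarge \
  --key-name <your-key-pair> \
  --block-device-mappings 'DeviceName=/dev/sda1,Ebs={VolumeSize=100,VolumeType=gp3}'
```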

2 changes: 2 additions & 0 deletions docker/README.md
@@ -125,6 +125,8 @@
For example, here is a command to run `cambioml/pykoi` version `0.1_ec2_linux`.
docker run -d -e RETRIEVAL_MODEL=mistralai/Mistral-7B-v0.1 -p 5000:5000 --gpus all --name pykoi_test cambioml/pykoi:0.1_ec2_linux
```

***Note: this command may take a few minutes***, since it loads an LLM into memory.
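Inside the container, the model id passed via `-e RETRIEVAL_MODEL` arrives as an ordinary environment variable. A hypothetical sketch of how an entrypoint might read it (not pykoi's actual code; the default mirrors the value used in the `docker run` command above):

```python
import os

# Hypothetical sketch: fall back to the same model id passed with
# -e RETRIEVAL_MODEL in the docker run command above.
model_id = os.environ.get("RETRIEVAL_MODEL", "mistralai/Mistral-7B-v0.1")
print(f"loading retrieval model: {model_id}")
```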

If you are running it in the background with the `-d` flag, you can check the logs using the following command:
```
docker logs [CONTAINER_NAME]
