
Model Self Serve #29

Open
ChromaticPanic opened this issue Nov 14, 2024 · 0 comments
Assignees
Labels
devops for deployment, enhancement (New feature or request), help wanted (Extra attention is needed)

Comments

ChromaticPanic (Contributor) commented Nov 14, 2024

Description
As a DevOps engineer, I would like to be able to serve models in a more flexible way instead of being vendor-locked to AzureML.


Frameworks Considered
Flask - AzureML endpoints serve models using Flask
TorchServe - used by GCP Vertex AI and AWS SageMaker
LitServe - a PyTorch Lightning project built on FastAPI, with optimizations for ML model serving
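To make the comparison concrete, here is a minimal, framework-agnostic sketch of what a self-served scoring endpoint looks like, using only the Python standard library. The `predict` function is a hypothetical stand-in for a real model call; any of the frameworks above would replace the HTTP plumbing shown here with their own (routing, batching, workers, etc.).

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def predict(features):
    """Hypothetical stand-in for a real model; returns a dummy mean score."""
    return {"score": sum(features) / max(len(features), 1)}


class ScoreHandler(BaseHTTPRequestHandler):
    """Accepts POST /score with JSON {"features": [...]} and returns JSON."""

    def do_POST(self):
        if self.path != "/score":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(predict(payload.get("features", []))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging


def serve(port=8000):
    """Blocks and serves scoring requests until interrupted."""
    HTTPServer(("0.0.0.0", port), ScoreHandler).serve_forever()
```

After calling `serve()`, the endpoint can be exercised with e.g. `curl -X POST localhost:8000/score -d '{"features": [1, 2, 3]}'`. The point of adopting TorchServe or LitServe instead of hand-rolling this is that they add the parts omitted here: model lifecycle management, dynamic batching, and GPU-aware workers.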

ChromaticPanic added the "bug (Something isn't working)" label Nov 14, 2024
ChromaticPanic added the "devops for deployment" and "enhancement (New feature or request)" labels and removed the "bug (Something isn't working)" label Nov 14, 2024
ChromaticPanic added the "help wanted (Extra attention is needed)" label Nov 14, 2024
Projects: None yet
Development: No branches or pull requests
2 participants