LocalPrompt is an AI-powered tool that refines and optimizes prompts, turning rough ideas into high-quality, structured prompts that work effectively with advanced AI models.
This tool is built to work locally with Mistral 7B, ensuring privacy and efficiency without relying on external cloud services. It is ideal for AI prompt engineers, machine learning developers, and researchers who want to run AI models offline.
- 🔹 Privacy & Security - No external API calls, all processing happens on your own machine
- 🔹 Cost-Effective - Avoid expensive API costs by running AI models completely offline
- 🔹 Customization - Fine-tune the model and modify it for specific use cases
- 🔹 Performance - Run low-latency AI models optimized for your local hardware (CPU/GPU)
- ✅ Refines AI Prompts - Converts rough ideas into high-quality, structured prompts
- ✅ Uses Mistral 7B - Locally hosted Mistral-7B-Instruct for privacy and efficiency
- ✅ Customizable - Supports different prompt optimizations and fine-tuning
- ✅ FastAPI Backend - Simple API with easy-to-use REST endpoints
LocalPrompt/
├── app/
│   ├── api/                  # API routes
│   │   └── prompt.py         # Prompt engineering API
│   ├── config/               # Core configurations
│   │   └── settings.py       # Environment & settings
│   ├── services/             # Business logic
│   │   └── prompt_service.py # Prompt processing logic
│   └── main.py               # FastAPI entry point
├── models/                   # AI model storage (ignored via .gitignore)
├── .env                      # Environment variables
├── .gitignore                # Git ignore file
├── requirements.txt          # Python dependencies
└── README.md                 # Project documentation
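The prompt-refinement logic lives in `app/services/prompt_service.py`. As a hypothetical sketch only (assuming the GGUF model is loaded with `llama-cpp-python`; the actual service may be implemented differently), the service might wrap the user's idea in Mistral-7B-Instruct's `[INST]` chat template before generating:

```python
# Hypothetical sketch of app/services/prompt_service.py.
# Assumes llama-cpp-python; names and structure here are illustrative.

REFINE_INSTRUCTION = (
    "Rewrite the following rough idea as a detailed, well-structured "
    "prompt for a large language model."
)

def build_instruct_prompt(idea: str) -> str:
    """Wrap an idea in Mistral-7B-Instruct's [INST] chat template."""
    return f"<s>[INST] {REFINE_INSTRUCTION}\n\nIdea: {idea} [/INST]"

def refine_prompt(idea: str, max_tokens: int = 100, temperature: float = 0.7,
                  top_k: int = 40, top_p: float = 0.9) -> str:
    """Run the local GGUF model over the templated idea."""
    # Lazy import so the template helper works without the model installed.
    from llama_cpp import Llama
    llm = Llama(model_path="models/mistral-7b-instruct-v0.1.Q4_K_M.gguf")
    out = llm(
        build_instruct_prompt(idea),
        max_tokens=max_tokens,
        temperature=temperature,
        top_k=top_k,
        top_p=top_p,
    )
    return out["choices"][0]["text"].strip()
```

The lazy import keeps the template helper testable even on machines without the model weights downloaded.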
git clone https://github.com/sawadkk/LocalPrompt.git
cd LocalPrompt
python -m venv venv
source venv/bin/activate # On macOS/Linux
venv\Scripts\activate # On Windows
pip install -r requirements.txt
mkdir models
curl -L -o models/mistral-7b-instruct-v0.1.Q4_K_M.gguf "https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/resolve/main/mistral-7b-instruct-v0.1.Q4_K_M.gguf?download=true"
uvicorn app.main:app --reload
- Endpoint: POST /api/generate_prompt/
- Request:
{
  "idea": "Create a sci-fi world with AI robots.",
  "max_tokens": 100,
  "temperature": 0.7,
  "top_k": 40,
  "top_p": 0.9
}
- Response:
{
  "original_idea": "Create a sci-fi world with AI robots.",
  "refined_prompt": "Design a detailed sci-fi world featuring advanced AI-driven civilizations...",
  "temperature": 0.7,
  "top_k": 40,
  "top_p": 0.9
}
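With the server running, the endpoint can be exercised from Python's standard library. A minimal sketch, assuming uvicorn's default host and port (`127.0.0.1:8000`):

```python
import json
from urllib import request

# Assumes uvicorn's default bind address; adjust if you run it elsewhere.
API_URL = "http://127.0.0.1:8000/api/generate_prompt/"

payload = {
    "idea": "Create a sci-fi world with AI robots.",
    "max_tokens": 100,
    "temperature": 0.7,
    "top_k": 40,
    "top_p": 0.9,
}

def generate_prompt(data: dict) -> dict:
    """POST the payload as JSON and return the decoded response body."""
    req = request.Request(
        API_URL,
        data=json.dumps(data).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    print(generate_prompt(payload)["refined_prompt"])
```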
- 🔹 Fine-tuning - Train Mistral 7B for more precise prompt refinement
- 🔹 Web UI - Add a frontend interface for prompt generation
- 🔹 Multi-model Support - Integrate OpenAI, LLaMA, and DeepSeek models for side-by-side comparisons
- Fork the repository
- Create a new branch:
git checkout -b feature-name
- Commit changes:
git commit -m "Added new feature"
- Push to branch:
git push origin feature-name
- Open a Pull Request 🎉