- Overview
- Features
- Project Structure
- Getting Started
- Project Roadmap
- Contributing
- License
- Acknowledgments
The E-Commerce-Support-Agent-RAG project demonstrates how a customer service agent for online retailers can work, using Adaptive RAG to deliver dynamic, personalized support.
Adaptive RAG is a method that chooses the best strategy for answering each question, ranging from a direct LLM response to single or multiple retrieval steps. The choice is based on the query's complexity, as determined by a classifier. The two retrieval strategies are:
- Single-Step Retrieval: For moderately complex questions, it retrieves information from a single external source, ensuring the answer is both swift and well-informed.
- Multi-Step Retrieval: For highly complex questions, it consults multiple sources, piecing together a detailed and comprehensive answer.
This system efficiently processes and responds to customer inquiries, ensuring a seamless QA experience. Ideal for e-commerce platforms seeking to enhance customer interaction and satisfaction, it leverages machine learning to handle queries with precision and speed.
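As a rough illustration of the routing described above, here is a minimal sketch (not the project's actual code) in which a hypothetical classifier grades each query and the answer path is chosen accordingly; `llm_generate`, `retrieve`, and `decompose` are illustrative stand-ins for the real LLM and retrieval services:

```python
from typing import Literal

# Illustrative stand-ins for the project's LLM and retrieval services.
def llm_generate(query: str, context: str = "") -> str:
    return f"answer({query!r}, context_len={len(context)})"

def retrieve(query: str, k: int = 4) -> str:
    return f"top-{k} passages for {query!r}"

def decompose(query: str) -> list[str]:
    return [part.strip() for part in query.split(" and ") if part.strip()]

def classify_query(query: str) -> Literal["simple", "moderate", "complex"]:
    # A real classifier would prompt the LLM; a crude length heuristic stands in here.
    words = len(query.split())
    if words < 8:
        return "simple"
    return "moderate" if words < 20 else "complex"

def answer(query: str) -> str:
    complexity = classify_query(query)
    if complexity == "simple":
        return llm_generate(query)                      # direct LLM response
    if complexity == "moderate":
        return llm_generate(query, retrieve(query))     # single-step retrieval
    contexts = [retrieve(q) for q in decompose(query)]  # multi-step retrieval
    return llm_generate(query, "\n\n".join(contexts))
```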
| Feature | Summary |
| --- | --- |
| Architecture | |
| Code Quality | |
| Documentation | |
| Integrations | |
| Modularity | |
| Performance | |
| Security | |
```sh
└── E-Commerce-Support-Agent-RAG/
    ├── LICENSE
    ├── README.md
    ├── agent
    │   ├── support_agent.py
    │   └── utils.py
    ├── app.py
    ├── chainlit.md
    ├── config
    │   └── settings.py
    ├── data
    │   ├── knowledge_base
    │   ├── loaders
    │   └── processors
    ├── llm
    │   ├── litellm_service.py
    │   └── prompt_templates.py
    ├── models
    │   └── schema.py
    ├── rag
    │   ├── embedding_service.py
    │   └── vector_store.py
    └── requirements.txt
```
E-COMMERCE-SUPPORT-AGENT-RAG/

__root__
- `app.py` - Serves as the entry point for the customer support chat application, initializing the support agent and handling user queries. Upon starting a chat, it loads and indexes the necessary data, displays initialization status, and sends a welcome message; incoming messages are then processed and answered by the support agent, providing a dynamic, interactive user experience.
- `requirements.txt` - Specifies the Python packages the project needs, ensuring consistent environments across setups. It includes libraries for AI operations, database interactions, environment-variable management, numerical computation, and testing, keeping development and deployment reproducible.
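For orientation, a Chainlit entry point of this shape might look roughly like the sketch below; the `EchoAgent` class and its `answer` method are illustrative stand-ins, not the project's actual `SupportAgent` API:

```python
import chainlit as cl

class EchoAgent:
    """Stand-in for the project's SupportAgent (illustrative only)."""
    def answer(self, query: str) -> str:
        return f"You asked: {query}"

agent: EchoAgent | None = None

@cl.on_chat_start
async def start():
    global agent
    agent = EchoAgent()  # the real app loads and indexes its data at this point
    await cl.Message(content="Support agent initialized. How can I help?").send()

@cl.on_message
async def respond(message: cl.Message):
    reply = agent.answer(message.content)  # generate a response for the incoming query
    await cl.Message(content=reply).send()
```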
rag
- `vector_store.py` - `ChromaVectorStore` wraps ChromaDB, storing pre-generated embeddings in a specified collection via an ephemeral client. It supports adding embeddings with optional texts, metadata, and custom IDs, generating UUIDs for documents when IDs are not provided, which lets the project manage and index large volumes of embedding data efficiently.
- `embedding_service.py` - `EmbeddingService` manages the generation of text embeddings with the LiteLLM library, configured with specific API keys and model settings. It supports batch processing for multiple texts as well as individual queries, providing a scalable embedding layer for the rest of the architecture.
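A hedged sketch of the pattern these two files describe: generating embeddings with LiteLLM and storing them in an ephemeral ChromaDB collection. It assumes an OpenAI API key in the environment; the embedding model name and sample texts are illustrative.

```python
import uuid
import chromadb
from litellm import embedding

# Generate embeddings via LiteLLM (model name is an assumption for illustration).
texts = ["Free returns within 30 days.", "Standard shipping takes 3-5 business days."]
resp = embedding(model="text-embedding-3-small", input=texts)
vectors = [item["embedding"] for item in resp.data]

# Store the pre-generated embeddings in an ephemeral ChromaDB collection.
client = chromadb.EphemeralClient()
collection = client.get_or_create_collection("knowledge_base")
collection.add(
    ids=[str(uuid.uuid4()) for _ in texts],  # UUIDs when the caller supplies no IDs
    embeddings=vectors,
    documents=texts,
    metadatas=[{"source": "policy"} for _ in texts],
)

# Retrieval later queries the collection with a query embedding.
query = embedding(model="text-embedding-3-small", input=["How long is shipping?"])
results = collection.query(query_embeddings=[query.data[0]["embedding"]], n_results=1)
print(results["documents"])
```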
agent
- `support_agent.py` - `SupportAgent` orchestrates customer interactions for the e-commerce platform, using the language model to process and respond to user queries. It initializes the services for data loading, text processing, and embedding generation, and handles query classification, context generation, and personalized customer responses based on the nature of each inquiry.
- `utils.py` - A utility module that parses responses from the language model and JSON files into structured Pydantic models, turning raw output into a form the rest of the application can process and integrate.
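The parsing step `utils.py` performs can be pictured roughly as follows; this is a sketch assuming Pydantic, with an illustrative `OrderQuery` model rather than the project's real schemas:

```python
import json
import re
from pydantic import BaseModel

class OrderQuery(BaseModel):
    """Illustrative target model; the project's schemas live in models/schema.py."""
    order_id: str
    question: str

def parse_llm_response(raw: str) -> OrderQuery:
    # Pull the first JSON object out of the reply, ignoring any surrounding prose or fences.
    match = re.search(r"\{.*\}", raw, re.DOTALL)
    if not match:
        raise ValueError("No JSON object found in LLM response")
    return OrderQuery(**json.loads(match.group(0)))

reply = 'Sure! ```json\n{"order_id": "A123", "question": "where is my package"}\n```'
print(parse_llm_response(reply))
```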
config
- `settings.py` - Establishes the environment for accessing API keys and configuring models. It initializes settings for the embedding and language models, defines parameters for data retrieval, and sets agent behavior controls, so the application interacts consistently with external AI services and manages data processing efficiently.
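A settings module of this kind might look roughly like the sketch below; every variable name and default value here is an assumption for illustration, not the project's actual configuration:

```python
import os
from dotenv import load_dotenv

load_dotenv()  # read OPENAI_API_KEY and friends from the .env file

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY", "")

EMBEDDING_MODEL = "text-embedding-3-small"   # model used by the embedding service
LLM_MODEL = "gpt-4o-mini"                    # model used by the LiteLLM service
TOP_K = 4                                    # passages returned per retrieval step
MAX_RETRIEVAL_STEPS = 3                      # cap on multi-step retrieval
```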
llm
- `prompt_templates.py` - Defines structured templates for handling customer support queries, focusing on extracting and responding to order-related information. Queries are categorized into general knowledge requests and requests for specific customer order data, so responses stay tailored to the details the customer explicitly asked for.
- `litellm_service.py` - `LiteLLMService` is the interface for generating responses with the language model. It initializes with API and model settings and exposes a method that produces text from a user prompt and an optional system message, handling errors gracefully along the way.
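The sketch below shows the general shape of such a service. The standalone `generate` function, its signature, the system prompt, and the model name are assumptions; only the `litellm.completion` call reflects the library's actual API:

```python
from litellm import completion

SYSTEM_PROMPT = (
    "You are a customer support agent for an online store. "
    "Answer only from the provided context and order details."
)

def generate(prompt: str, system: str = SYSTEM_PROMPT, model: str = "gpt-4o-mini") -> str:
    """Minimal stand-in for a LiteLLM-backed response generator (illustrative)."""
    try:
        resp = completion(
            model=model,
            messages=[
                {"role": "system", "content": system},
                {"role": "user", "content": prompt},
            ],
        )
        return resp.choices[0].message.content
    except Exception as exc:
        # The real service handles provider errors gracefully; here we just surface them.
        return f"Sorry, something went wrong: {exc}"

print(generate("What is your return policy?"))
```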
models
- `schema.py` - Defines the data models for managing product and order information. `Product`, `OrderDetails`, and `CustomerDetails` provide structured storage and retrieval for order processing and customer management, supporting the system's handling of e-commerce transactions.
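Those models might be sketched along these lines, assuming Pydantic v2; the field names are illustrative guesses, and only the class names come from the summary above:

```python
from pydantic import BaseModel

class Product(BaseModel):
    product_id: str
    name: str
    price: float

class OrderDetails(BaseModel):
    order_id: str
    items: list[Product]
    status: str

class CustomerDetails(BaseModel):
    customer_id: str
    name: str
    orders: list[OrderDetails] = []

order = OrderDetails(
    order_id="A123",
    items=[Product(product_id="P1", name="USB-C cable", price=9.99)],
    status="shipped",
)
print(order.model_dump())  # structured data ready for storage or prompting
```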
Install E-Commerce-Support-Agent-RAG using one of the following methods:
Build from source:
- Clone the E-Commerce-Support-Agent-RAG repository:
```sh
git clone https://github.com/HopMaster03/E-Commerce-Support-Agent-RAG
```
- Navigate to the project directory:
```sh
cd E-Commerce-Support-Agent-RAG
```
- Install the project dependencies:
```sh
pip install -r requirements.txt
```
- Set up the `.env` file with your OpenAI API key:
```sh
OPENAI_API_KEY="your API key here"
```
Run E-Commerce-Support-Agent-RAG using the following command:
```sh
chainlit run app.py -w
```
- **Task 1**: Adaptive RAG
- **Task 2**: Enabling conversation history & caching.
- **Task 3**: Metadata Filtering.
- Join the Discussions: Share your insights, provide feedback, or ask questions.
- Report Issues: Submit bugs found or log feature requests for the E-Commerce-Support-Agent-RAG project.
- Submit Pull Requests: Review open PRs, and submit your own PRs.
Contributing Guidelines
- Fork the Repository: Start by forking the project repository to your GitHub account.
- Clone Locally: Clone the forked repository to your local machine using a git client.
```sh
git clone https://github.com/HopMaster03/E-Commerce-Support-Agent-RAG
```
- Create a New Branch: Always work on a new branch, giving it a descriptive name.
```sh
git checkout -b new-feature-x
```
- Make Your Changes: Develop and test your changes locally.
- Commit Your Changes: Commit with a clear message describing your updates.
```sh
git commit -m 'Implemented new feature x.'
```
- Push to GitHub: Push the changes to your forked repository.
```sh
git push origin new-feature-x
```
- Submit a Pull Request: Create a PR against the original project repository. Clearly describe the changes and their motivations.
- Review: Once your PR is reviewed and approved, it will be merged into the main branch. Congratulations on your contribution!
This project is licensed under the MIT License. For more details, refer to the LICENSE file.
- List any resources, contributors, inspiration, etc. here.