- A chat server sends a message to this server via a webhook
- This server should send that message to Dialogflow, which will return the intent of the message
- This server should search for a matching response in the database and reply to the chat server
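The receiving end of this flow can be sketched as a minimal Flask webhook handler. This is an illustration only: the route name, the `utterance` payload key, and the `resolve_intent` helper are assumptions, not the actual API of this repository.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def resolve_intent(utterance):
    # Hypothetical stand-in: the real server would call Dialogflow's
    # SessionsClient here and return queryResult.action for the utterance.
    return "order_snack"

@app.route("/webhook", methods=["POST"])
def webhook():
    # The chat server POSTs the message here; the payload shape
    # ({"utterance": ...}) is assumed for illustration.
    message = request.get_json()
    intent = resolve_intent(message["utterance"])
    # Reply with the shape the chat server expects
    return jsonify({"intent": intent})
```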
- git
- docker
- sending HTTP requests to an API server with cURL or Postman
- a language and an application server of choice, Python and Flask in this case
- a way to expose a local server to the public, e.g. ngrok
- a basic understanding of ORMs (object-relational mapping), in this case SQLAlchemy. General knowledge of ORMs will suffice; we will only use the most basic features in this experiment
- a basic understanding of server-to-server communication, webhooks to be specific
- Pull this repository
- Start the server with docker (see commands below)
- Get your user id and save it in a variable
- Retrieve the service account from `users/[your user id]/service_account` with curl, Postman or some other means, and save it in a file (e.g. `service_account.json`)
- Expose the server to the public using ngrok
- Register your webhook endpoint under the chat server's `users/[your user id]/webhook` with a PUT request; the endpoint expects a JSON body with the structure `{ "webhook": your webhook here }`
- Use `/users/[your user id]/simulate-actions/speaks` to request the chat server to send you a message
- Extract the utterance from the message, and use `SessionsClient` from Dialogflow to resolve the intent
- Return `{ "intent": the name of the intent resolved by dialogflow }` as the response of your webhook; we will treat the value in `queryResult.action` as the intent name
- Call `/users/[your user id]/simulate-actions/speaks` again, but with `{ "should_validate_image_url": true }` as the request body. This time, after getting the intent from Dialogflow, find the requested snack or drink in the database and return it as a JSON object
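The Dialogflow step above can be sketched with the official `google-cloud-dialogflow` (v2) client. The project id, session id, and the `webhook_body` helper are illustrative assumptions; the import is deferred so the sketch loads even without the package installed.

```python
def resolve_intent(project_id, session_id, text, language_code="en"):
    """Resolve an utterance to an intent action via Dialogflow.

    Requires the google-cloud-dialogflow package and the
    GOOGLE_APPLICATION_CREDENTIALS env var pointing at service_account.json.
    """
    from google.cloud import dialogflow  # deferred: needs credentials to use

    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, session_id)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code=language_code)
    )
    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    # The experiment treats queryResult.action as the intent name
    return response.query_result.action

def webhook_body(action):
    # Shape of the JSON body the webhook should return to the chat server
    return {"intent": action}
```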
- this README
- `main.py`, the entry point of the server
- `Dockerfile`, for creating the container to run the server
- `pyproject.toml` and `poetry.lock`, used by Poetry for installing dependencies, which is already handled in the `Dockerfile`
- `models.py`, defining the SQLAlchemy models
- `database.py`, for setting up the database connection and configuring the ORM
- `database.sqlite`, storing the responses you will need to send back to the chat server. Used by the ORM
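A database lookup of the kind described above might look like the following SQLAlchemy sketch. The `Response` model and its column names are hypothetical; the real schema lives in `models.py`.

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Response(Base):
    # Hypothetical model; the actual one is defined in models.py
    __tablename__ = "responses"
    id = Column(Integer, primary_key=True)
    intent = Column(String, nullable=False)
    text = Column(String, nullable=False)

def find_response(session, intent):
    # Look up the stored reply for an intent resolved by Dialogflow
    return session.query(Response).filter_by(intent=intent).first()
```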
# build the image
docker build --tag dialogflow-nlp-experiment .
# run the container (the image runs Flask in development mode, so it auto-reloads on file changes; manually restarting the container is not necessary)
docker run --rm -v `pwd`:/app -p 5000:5000 dialogflow-nlp-experiment