Merge from gitlab
georgehgfonseca committed Jan 4, 2023
1 parent 81feed7 commit 71ebe04
Showing 201 changed files with 7,055 additions and 163 deletions.
5 changes: 3 additions & 2 deletions .gitignore
100644 → 100755
@@ -133,8 +133,9 @@ dmypy.json

/images/*
/app/sql_app.db
/f05_backend*
/f05_backend.env
/database.env
/reports/
.history/
.vscode/
.vscode/
.vercel
2 changes: 2 additions & 0 deletions Dockerfile
100644 → 100755
@@ -8,6 +8,8 @@ RUN pip install -r requirements.txt

RUN mkdir -p ./images

RUN mkdir -p ./reports

WORKDIR /f05_backend/app

RUN pip install -e ./src
Empty file modified LICENSE
100644 → 100755
Empty file.
72 changes: 44 additions & 28 deletions README.md
100644 → 100755
@@ -2,13 +2,13 @@

This project was made using the [FastAPI](https://fastapi.tiangolo.com/) framework,
and its main goal is to provide a backend for the mobile project developed alongside it.

## 1. Setup

To run this project, first install all dependencies listed in
**requirements.txt**. It is advisable to do the setup inside some kind of
virtual environment.

```bash
pip install -r requirements.txt
```
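A minimal sketch of that advice, assuming the standard `venv` module and a Unix-like shell (the environment name `.venv` is just an illustrative choice):

```bash
# Illustrative: create and activate a virtual environment before installing the requirements
python3 -m venv .venv
source .venv/bin/activate
```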
@@ -22,15 +22,15 @@ pip install -e .

### 1.1. Environment Variables

This project is configured to use .env files to set the main environment variables. For the project to
fully work, it is necessary to create a **f05_backend.env** file inside the root folder. This file should contain:

```bash
ENVIRONMENT="development"
SENTRY_KEY=<SENTRY_DNS_ADDRESS>
IMAGE_FOLDER="../images/"
DATABASE_URL="sqlite:///./sql_app.db"
API_PREFIX=""
TOKEN_CEP_ABERTO=<TOKEN_CEP_ABERTO>
SECRET_KEY=<SECRET_KEY_TO_GENERATE_TOKEN>
API_KEY=<GENERATED_API_KEY>
@@ -55,7 +55,7 @@ In order to fill this API Key a strong password generator with 20 digits result
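As a hedged illustration of that advice (any strong generator works; the command below is just one assumed option using Python's standard `secrets` module):

```bash
# Illustrative only: print a random 20-character value suitable for API_KEY
python3 -c "import secrets; print(secrets.token_urlsafe(32)[:20])"
```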

### 1.2. Database

The project is configured to use SQLAlchemy as the framework for connecting to the database. During development,
the **SQLite** database was used, while the deployment uses PostgreSQL through the docker-compose configuration.

The environment variable **DATABASE_URL** in the file created in section 1.1 tells the service which database to connect to.
@@ -106,7 +106,7 @@ services:
volumes:
  database-data: # named volumes can be managed easier using docker-compose

networks:
  postgres-compose-network:
    driver: bridge
```
@@ -115,9 +115,9 @@ After everything it's set, go to the file **f05_backend_env** and change the **D

Using the values from this example, we get:

```bash
DATABASE_URL = "postgresql://mp_user:magical_password_example@mp_f05_database/db"
```

With this, the service has a working connection to the database (the URL follows the pattern `postgresql://<user>:<password>@<host>/<database>`, where the host matches the database service's name in the compose file).

@@ -129,7 +129,7 @@ look like.
MongoDB is the database used for the collect queue process. All collects are first saved in MongoDB, and only after
being evaluated are they saved into PostgreSQL and made available in the app.

The first step is to install a MongoDB instance on the machine or in the docker environment. To install it using docker-compose,
add the following lines to your docker-compose file:

```bash
@@ -147,18 +147,18 @@ services:
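The project's actual MongoDB service definition is collapsed in the hunk above; purely as an assumed sketch (the image tag, port, and credentials below are illustrative and not taken from this repository), a minimal MongoDB entry in a docker-compose file generally looks like:

```bash
# Illustrative sketch only -- not this repository's actual configuration
services:
  mp_f05_mongo:
    image: mongo:5.0
    restart: always
    ports:
      - "27017:27017"
    environment:
      MONGO_INITDB_ROOT_USERNAME: mp_user
      MONGO_INITDB_ROOT_PASSWORD: magical_password_example
```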
It is possible to run the project with docker, deploying it through docker-compose and either building the image locally or
getting the docker image from the GitHub package repository.

To build the image locally from the source code, run:

```bash
docker build -t f05_backend .
```

Add the following lines to the compose file that contains the database configuration:

```bash
f05_backend:
#Uncomment the two following lines and comment out image if you want to build from the local project
#build: ./
#container_name: "f05_backend"
image: docker.pkg.github.com/mpmg-dcc-ufmg/f05-mobile-backend/f05-backend-image:v1.1.2
labels:
@@ -189,13 +189,13 @@ After the image it's built the container can be deployed with the command:
docker-compose up -d
```
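A quick, optional way to confirm the services came up (a standard docker-compose command, nothing specific to this project):

```bash
# List the compose services and their current state
docker-compose ps
```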

To check that the container was successfully deployed, open the browser at the
address **_http://localhost/f05_backend/docs/_**

If it is necessary to enter the container's shell to run alembic migrations or other scripts, execute:

```bash
docker-compose exec f05_backend sh
```
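As a hedged sketch of that step (it assumes the Alembic configuration lives in the container's working directory, which the Dockerfile sets to /f05_backend/app; adjust the path if it differs), pending migrations could be applied with:

```bash
# Illustrative: run Alembic migrations inside the running container
docker-compose exec f05_backend sh -c "alembic upgrade head"
```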

Container's logs can be seen using:
@@ -206,11 +206,11 @@ docker-compose logs f05_backend

### 1.4 Third Party

Some third-party solutions are used to deliver features:

- [CEP Aberto](https://cepaberto.com/): a project that aims to provide free access to, and to collaboratively
  build, a database of geolocated Postal Address Codes (CEP) from all over Brazil. **It is necessary to
  get an API key from this service in order to use the get-address-by-CEP feature.**

### 1.3.1 Useful commands

@@ -220,7 +220,7 @@
docker system prune -a
```

2. Remove all images

```bash
docker rmi $(docker images -a -q)
@@ -239,19 +239,35 @@
docker volume prune
```

## 2. Creating mock data (optional)

After starting the API, you can load it with mock data by running two commands:

1. Run the Python script to populate the database from the .csv files:

```bash
cd initial_data && python3 add_to_server.py
```

2. Copy the images from initial_data/photos/images to the images folder:

```bash
cd .. && cp initial_data/photos/images/* ./images
```

## 3. Documentation

FastAPI offers automatic documentation of the implemented endpoints through
Swagger or ReDoc. You can access this documentation at the addresses below (a quick command-line check is sketched after the list):

- Swagger: http://0.0.0.0:8000/docs
- ReDoc: http://0.0.0.0:8000/redoc
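As a quick sanity check (assuming the API is running locally on port 8000 and the default schema path has not been changed, e.g. by **API_PREFIX**), the OpenAPI schema that backs both UIs can be fetched directly:

```bash
# Fetch the OpenAPI schema served by FastAPI (backs both Swagger UI and ReDoc)
curl http://0.0.0.0:8000/openapi.json
```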

## 4. Sponsors

<h1 align="center">
<a href="https://www.mpmg.mp.br/"><img src="./assets/mmpg_logo.png" alt="MPMG"></a>
<br>
<a href="https://sentry.io/"><img src="./assets/sentry-logo-black.png" alt="Sentry"></a>
<br>
</h1>
Empty file modified app/__init__.py
100644 → 100755
Empty file.
Empty file modified app/alembic.ini
100644 → 100755
Empty file.
Empty file modified app/alembic/README.md
100644 → 100755
Empty file.
Empty file modified app/alembic/env.py
100644 → 100755
Empty file.
Empty file modified app/alembic/script.py.mako
100644 → 100755
Empty file.
Empty file modified app/main.py
100644 → 100755
Empty file.
Empty file modified app/src/__init__.py
100644 → 100755
Empty file.
Empty file modified app/src/application/__init__.py
100644 → 100755
Empty file.
Empty file modified app/src/application/address/__init__.py
100644 → 100755
Empty file.
Empty file modified app/src/application/address/database/__init__.py
100644 → 100755
Empty file.
7 changes: 4 additions & 3 deletions app/src/application/address/database/addressDB.py
100644 → 100755
@@ -2,6 +2,7 @@

from application.core.database import Base
from application.core.helpers import generate_uuid, is_valid_uuid
from sqlalchemy.orm import relationship

from application.address.models.address import Address

@@ -18,7 +19,9 @@ class AddressDB(Base):
city = Column(String)
state = Column(String, default="MG")
cep = Column(String)
public_work_id = Column(String)

publicworks = relationship("PublicWorkDB", cascade="all,delete-orphan", backref="publicwork")


@classmethod
def from_model(cls, address: Address):
@@ -31,7 +34,6 @@ def from_model(cls, address: Address):
city=address.city,
state=address.state,
cep=address.cep,
public_work_id=address.public_work_id
)

if address.id and is_valid_uuid(address.id):
@@ -49,4 +51,3 @@ def update(self, address: Address):
self.longitude = address.longitude
self.city = address.city
self.cep = address.cep
self.public_work_id = address.public_work_id
Empty file modified app/src/application/address/database/cityDB.py
100644 → 100755
Empty file.
Empty file modified app/src/application/address/database/repository.py
100644 → 100755
Empty file.
Empty file modified app/src/application/address/models/__init__.py
100644 → 100755
Empty file.
1 change: 0 additions & 1 deletion app/src/application/address/models/address.py
100644 → 100755
@@ -11,7 +11,6 @@ class Address(BaseModel):
city: str
state: str = "MG"
cep: str
public_work_id: str = None

class Config:
orm_mode = True
Empty file modified app/src/application/address/models/city.py
100644 → 100755
Empty file.
3 changes: 1 addition & 2 deletions app/src/application/address/routes.py
100644 → 100755
@@ -47,7 +47,6 @@ async def get_address_by_cep(cep: str) -> Address:
city=parsed_response["cidade"]["nome"],
state="MG",
cep=cep,
public_work_id=None
)
return address
except Exception:
@@ -69,7 +68,7 @@ async def get_all_cities(db: Session = Depends(get_db)) -> List[City]:
return repository.get_all_cities(db)

@staticmethod
@address_router.post("/city/delete")
@address_router.delete("/city/delete")
async def delete_city(codigo_ibge: str, db: Session = Depends(get_db)) -> Response:
city_db = repository.delete_city(db, codigo_ibge)
if city_db:
Empty file modified app/src/application/associations/__init__.py
100644 → 100755
Empty file.
Empty file modified app/src/application/associations/database/__init__.py
100644 → 100755
Empty file.
Empty file modified app/src/application/associations/database/association_tp_tw.py
100644 → 100755
Empty file.
Empty file modified app/src/application/associations/database/association_tw_ws.py
100644 → 100755
Empty file.
Empty file modified app/src/application/associations/database/repository.py
100644 → 100755
Empty file.
Empty file modified app/src/application/associations/models/__init__.py
100644 → 100755
Empty file.
Empty file modified app/src/application/associations/models/association.py
100644 → 100755
Empty file.
Empty file modified app/src/application/associations/routes.py
100644 → 100755
Empty file.
Empty file modified app/src/application/calendar/__init__.py
100644 → 100755
Empty file.
Empty file modified app/src/application/calendar/calendar_utils.py
100644 → 100755
Empty file.
Empty file modified app/src/application/collect/__init__.py
100644 → 100755
Empty file.
Empty file modified app/src/application/collect/database/__init__.py
100644 → 100755
Empty file.
19 changes: 14 additions & 5 deletions app/src/application/collect/database/collectDB.py
100644 → 100755
@@ -1,6 +1,7 @@
from sqlalchemy import Column, BigInteger, String, ForeignKey, Integer
from application.calendar.calendar_utils import get_today
from sqlalchemy import Column, BigInteger, String, ForeignKey, Integer, Boolean

from sqlalchemy.orm import relationship
from sqlalchemy.orm import backref, relationship

from application.core.database import Base
from application.core.helpers import generate_uuid, is_valid_uuid
@@ -14,11 +15,13 @@ class CollectDB(Base):
id = Column(String, primary_key=True, index=True, default=generate_uuid)
date = Column(BigInteger)
comments = Column(String)
user_email = Column(String)
public_work_status = Column(Integer)
queue_status = Column(Integer, default=0)
queue_status_date = Column(BigInteger, default=get_today())
inspection_flag = Column(String, nullable=True)

user_email = Column(String, ForeignKey("user.email"), nullable=True)
public_work_id = Column(String, ForeignKey("publicwork.id"))
inspection_flag = Column(String, ForeignKey("inspection.flag"), nullable=True)

photos = relationship("PhotoDB", cascade="all,delete-orphan", backref="photo")

@@ -30,7 +33,9 @@ def from_model(cls, collect: Collect):
public_work_id=collect.public_work_id,
inspection_flag=collect.inspection_flag if collect.inspection_flag != "" else None,
user_email=collect.user_email,
public_work_status=collect.public_work_status
public_work_status=collect.public_work_status,
queue_status=collect.queue_status,
queue_status_date=collect.queue_status_date
)

if collect.id and is_valid_uuid(collect.id):
@@ -47,6 +52,8 @@ def parse_to_collect(self):
user_email=self.user_email,
comments=self.comments,
public_work_status=self.public_work_status,
queue_status=self.queue_status,
queue_status_date=self.queue_status_date,
photos=self.photos
)

@@ -58,3 +65,5 @@ def update(self, collect: Collect):
self.user_email = collect.user_email
self.date = collect.date
self.public_work_status = collect.public_work_status
self.queue_status=collect.queue_status
self.queue_status_date=collect.queue_status_date
32 changes: 23 additions & 9 deletions app/src/application/collect/database/repository.py
100644 → 100755
@@ -1,22 +1,31 @@
from pathlib import Path
from typing import List, Optional

from application.core.helpers import paginate
from application.core.models.pagination import Pagination
from sqlalchemy import desc
from sqlalchemy.orm import Session

import application.publicwork.database.repository as public_work_repository
from application.calendar.calendar_utils import get_first_day_of_month
from application.collect.database.collectDB import CollectDB
from application.collect.models.collect import Collect
from application.collect.models.collect_report import CollectReport

from application.calendar.calendar_utils import get_first_day_of_month
from application.core.helpers import paginate
from application.core.models.pagination import Pagination
from application.file.file_utils import create_json_file_from_list

import application.publicwork.database.repository as public_work_repository


def get_all_collect(db: Session) -> List[Collect]:
return db.query(CollectDB).all()
return db.query(CollectDB).order_by(desc("queue_status_date")).all()

def get_all_citizen_collects(db: Session) -> List[Collect]:
is_citizen_collect = CollectDB.inspection_flag.is_(None)
is_not_pending = CollectDB.queue_status.is_distinct_from(0)
is_not_deleted = CollectDB.queue_status.is_distinct_from(3)

return db.query(CollectDB).order_by(desc("queue_status_date")).filter(is_citizen_collect, is_not_pending, is_not_deleted).all()

def get_citizen_collects_queue(db: Session) -> List[Collect]:
return db.query(CollectDB).order_by(desc("queue_status_date")).filter(CollectDB.inspection_flag.is_(None), CollectDB.queue_status.is_(0)).all()


def get_all_collect_paginated(db: Session, page: int, per_page: int = 20) -> Optional[Pagination]:
@@ -28,11 +37,16 @@ def get_collect_by_id(db: Session, collect_id: str) -> Collect:


def get_public_work_collects(db: Session, public_work_id: str) -> List[Collect]:
return db.query(CollectDB).filter(CollectDB.public_work_id == public_work_id).all()
return db.query(CollectDB).order_by(desc("queue_status_date")).filter(CollectDB.public_work_id == public_work_id).all()

def get_public_work_citizen_collects(db: Session, public_work_id: str) -> List[Collect]:
is_citizen_collect = CollectDB.inspection_flag.is_(None)
is_available = CollectDB.queue_status.is_(0)
return db.query(CollectDB).order_by(desc("queue_status_date")).filter(CollectDB.public_work_id == public_work_id, is_citizen_collect, is_available).all()


def get_inspection_collects(db: Session, inspection_flag: str) -> List[Collect]:
return db.query(CollectDB).filter(CollectDB.inspection_flag == inspection_flag).all()
return db.query(CollectDB).order_by(desc("queue_status_date")).filter(CollectDB.inspection_flag == inspection_flag).all()


def add_collect(db: Session, collect: Collect) -> Collect:
Empty file modified app/src/application/collect/models/__init__.py
100644 → 100755
Empty file.
9 changes: 6 additions & 3 deletions app/src/application/collect/models/collect.py
100644 → 100755
@@ -1,13 +1,16 @@
from typing import List
from typing import List, Optional

from application.photo.models.photo import Photo
from pydantic import BaseModel

from application.photo.models.photo import Photo


class Collect(BaseModel):
id: str = None
public_work_id: str
inspection_flag: str
inspection_flag: Optional[str]
queue_status: int = 0
queue_status_date: int = None
date: int
user_email: str
comments: str = None
Empty file modified app/src/application/collect/models/collect_report.py
100644 → 100755
Empty file.