Merge pull request #5 from AstonAirQuality/refactor
Refactored code
Riyad-boop authored Sep 26, 2023
2 parents c78f3d6 + 2d63275 commit d277c0c
Showing 70 changed files with 1,386 additions and 1,168 deletions.
30 changes: 30 additions & 0 deletions .env template
@@ -0,0 +1,30 @@
# FILL IN THE FOLLOWING VARIABLES WITH YOUR OWN VALUES
PLUME_EMAIL=YOUR_PLUME_EMAIL
PLUME_PASSWORD=YOUR_PLUME_PASSWORD
JWT_SECRET=YOUR_JWT_SECRET
ZEPHYR_USERNAME=YOUR_ZEPHYR_USERNAME
ZEPHYR_PASSWORD=YOUR_ZEPHYR_PASSWORD
SC_USERNAME=YOUR_SC_USERNAME
SC_PASSWORD=YOUR_SC_PASSWORD
CRON_JOB_TOKEN=YOUR_CRON_JOB_TOKEN

# TEST_DATABASE_URL
DATABASE_URL=postgresql+psycopg2://postgres:password@localhost:5432/air_quality_db

# DEV_DATABASE_URL
DATABASE_URL_DEV=postgresql+psycopg2://postgres:password@db:5432/air_quality_db

DB_USER_TEST=postgres
DB_PASSWORD_TEST=password
DB_NAME_TEST=air_quality_db

PLUME_FIREBASE_API_KEY=AIzaSyA77TeuuxEwGLR3CJV2aQxLYIetMqou5No
PLUME_ORG_NUM=85

[email protected]
PGADMIN_PASSWORD_TEST=admin

# Production env variables
PRODUCTION_MODE=FALSE
FASTAPI_SENTRY_DSN=https://[email protected]/6709221
FASTAPI_SENTRY_SAMPLE_RATE=1.0
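As an illustrative sketch of how code might consume the variables defined in this template: the variable names (`DATABASE_URL`, `PRODUCTION_MODE`) come from the template above, but the `read_bool` helper is hypothetical and not part of the repository.

```python
import os

def read_bool(name: str, default: bool = False) -> bool:
    """Interpret a TRUE/FALSE environment variable as a boolean."""
    return os.environ.get(name, str(default)).strip().upper() == "TRUE"

# Names taken from the .env template above
database_url = os.environ.get("DATABASE_URL", "")
production_mode = read_bool("PRODUCTION_MODE")
```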
Binary file added .github/resources/AAQAPI.webp
Binary file not shown.
Binary file added .github/resources/crest.png
32 changes: 32 additions & 0 deletions .github/resources/further-Instructions/DatabaseMigrations.md
@@ -0,0 +1,32 @@
# Database migrations
Running migrations can fail due to the use of spatial fields in the database, because the geoalchemy package is not fully compatible with the alembic package.
To run a migration, follow the steps below.

## Running a migration
To begin, start the Docker containers.

To create a new migration, use the command ```docker-compose exec app alembic revision --autogenerate -m "New Migration"```.
<br>You may use a custom migration name instead of "New Migration".


To apply the migration, use the command ```docker-compose exec app alembic upgrade head```
<br>**You should always check your migration file before running it.**


## Checking a migration file

### Finding the migration file
From the project root directory, go into the alembic/versions directory; your new migration file will be located there.

### Checking the migration file
Errors can occur when inserting or changing columns with spatial fields.

For more information, see:
- https://gist.github.com/utek/6163250
- https://geoalchemy-2.readthedocs.io/en/latest/alembic.html

If spatial fields exist in the migration file, you must follow these steps:

- Remove the `create_index` statement for spatial fields in the `upgrade()` function.
- Remove the `drop_index` statement for spatial fields in the `downgrade()` function.
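As a rough aid for the check above, a small script can flag the index statements that mention spatial columns. This helper is illustrative, not part of the repository, and it assumes the autogenerated index names or arguments mention the spatial column (e.g. `geom`):

```python
import re
from typing import List

def find_spatial_index_ops(migration_source: str) -> List[str]:
    """Return op.create_index / op.drop_index lines that mention a spatial
    column, so they can be removed by hand from upgrade()/downgrade()."""
    pattern = re.compile(r"op\.(create_index|drop_index)\(")
    return [
        line.strip()
        for line in migration_source.splitlines()
        if pattern.search(line) and "geom" in line.lower()
    ]
```

Running it over a freshly autogenerated revision file lists the lines to delete before applying the migration.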
12 changes: 12 additions & 0 deletions .github/resources/further-Instructions/Deployment.md
@@ -0,0 +1,12 @@
# Deployment

## Creating a zip file of the api
- Install dependencies with the command ```python deployment/scripts/installDependancies.py```
- Zip the project with the command ```python deployment/scripts/zipProject.py```

## Uploading the file to AWS
- Check that the unzipped file size does not exceed 250 MB (the zip file should not be larger than 75 MB). If it does, consider setting up a Lambda image deployment.
- Log in to AWS and navigate to the S3 bucket.
- Upload the new app.zip file and copy its URL path.
- Navigate to Lambda, open the Code tab, select "Upload from Amazon S3 location", and paste the URL path.
- Once the upload is complete, navigate to the API's base URL and check that it works correctly.
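The size limits above can be checked locally before uploading. A minimal sketch using only the standard library; the 250 MB / 75 MB figures come from the step above, and `check_zip_sizes` is a hypothetical helper, not a repository script:

```python
import os
import zipfile

UNZIPPED_LIMIT = 250 * 1024 * 1024  # limit on the unpacked package size
ZIPPED_LIMIT = 75 * 1024 * 1024     # guideline for the zip file itself

def check_zip_sizes(path: str) -> bool:
    """Return True if the archive fits within both size limits."""
    with zipfile.ZipFile(path) as zf:
        unzipped = sum(info.file_size for info in zf.infolist())
    zipped = os.path.getsize(path)
    return unzipped <= UNZIPPED_LIMIT and zipped <= ZIPPED_LIMIT
```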
File renamed without changes.
66 changes: 66 additions & 0 deletions .github/resources/further-Instructions/old-files/oldReadme.md
@@ -0,0 +1,66 @@
# AirQuality-API
Serverless AWS Lambda & FastAPI project to fetch/scrape data from Aston's air quality sensors into a Postgres database,
manage sensors, and query sensor data.

API available on: https://rn3rb93aq5.execute-api.eu-west-2.amazonaws.com/prod/ (open)

# Setting up Docker

## Installation (for Windows)
Install Docker Desktop - https://docs.docker.com/desktop/install/windows-install/

## Building containers from Docker Compose
**Ensure you are in the root project directory and the env file is in the project root directory**
### Building the development container stack
Run the command ```docker-compose up```
### Building the test container stack
Run the command ```docker-compose -f docker-compose-testenv.yml -p test up -d```
Delete the test container stack with ```docker-compose -f docker-compose-testenv.yml -p test down --volumes```
## Exiting Docker
If using Docker Desktop, you can simply click stop on the running containers for "app".
If Docker was started from a terminal, press **Ctrl + C** (press it twice to force close).

# Setting up the Python environment (Windows setup)
Install Python 3.9.6 or the latest version: https://www.python.org/downloads/
cd into the root project directory

Run the following commands to create a virtual Python environment for this project and upgrade pip:
- ```python -m venv env```
- ```python.exe -m pip install --upgrade pip```

## Activating the virtual Python environment
```cd env/scripts && activate && cd..\..```

### If the above command fails, run the steps separately
- ```cd env/scripts```
- ```activate```
- ```cd..\..```

## Installing project dependencies
```pip install -r requirements.txt```

## Uninstalling project dependencies (forced)
```pip uninstall -y -r requirements.txt```

## Save project dependencies
```pip freeze > requirements.txt```

# Testing
From the project root directory, run the command ```cd app```

Run tests without coverage: ```python -m unittest discover -s testing -p test_*.py```

Run tests with coverage: ```python -m coverage run -m unittest discover -s testing -p test_*.py```

View the coverage report: ```python -m coverage report --omit="*/testing*"```

Export coverage as HTML: ```python -m coverage html --omit="*/testing*"```

For more information, see:
- https://www.pythontutorial.net/python-unit-testing/python-unittest-coverage/
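A minimal test module that the discovery pattern above (`-s testing -p test_*.py`) would pick up might look like the following; the file name `testing/test_example.py` and its contents are hypothetical, not files from the repository:

```python
# testing/test_example.py (hypothetical) -- matches the test_*.py pattern
import unittest

class TestExample(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)
```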

## Using test suites and generating HTML reports
Amend the code in the HTMLTestRunner package; to do this, see: https://stackoverflow.com/questions/71858651/attributeerror-htmltestresult-object-has-no-attribute-count-relevant-tb-lev

Then you can run the TestRunner scripts, which are located in the testing/suites directory, as Python modules.

File renamed without changes.
40 changes: 23 additions & 17 deletions .vscode/launch.json
@@ -6,8 +6,14 @@
"name": "Python: Module - app", // top-level package called "app"
"type": "python",
"request": "launch",
"module": "${fileBasenameNoExtension}",
"cwd": "${workspaceFolder}/app"
"module": "uvicorn",
"cwd": "${workspaceFolder}\\app",
"args": [
"main:app",
"--port",
"8000",
"--reload"
],
},
{
"name": "Python: Module - api_wrappers",
@@ -68,20 +74,20 @@
"console": "integratedTerminal",
"justMyCode": true
},
{
"name": "Docker: Python - Fastapi",
"type": "docker",
"request": "launch",
"preLaunchTask": "docker-run: debug",
"python": {
"pathMappings": [
{
"localRoot": "${workspaceFolder}",
"remoteRoot": "/app"
}
],
"projectType": "fastapi"
}
}
// {
// "name": "Docker: Python - Fastapi",
// "type": "docker",
// "request": "launch",
// "preLaunchTask": "docker-run: debug",
// "python": {
// "pathMappings": [
// {
// "localRoot": "${workspaceFolder}",
// "remoteRoot": "/app"
// }
// ],
// "projectType": "fastapi"
// }
// }
]
}