Handling of automatic data update
⚠ Never ever commit any ID or data from the UTC DB ⚠
UTC Data
UTC data is hosted on an Oracle database.
Step 1
- Create a new django app (will be done in Java) in the folder `interface_UTC` (at the root of the repo).
- The app should connect to the UTC DB and convert the 4 views to JSON arrays.
- This app must advertise one endpoint for each view.
- The app must advertise other endpoints that accept a login as a parameter (a sketch of one such endpoint follows this list):
  - to check the sharing permissions associated with a student,
  - to get all the departure information for a student, so that we don't have to fetch all the data when there is a permission change.
- The app must be documented, consistent (error codes) and resilient.
- All endpoints will be read-only.
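As an illustration, here is a minimal sketch of one such read-only endpoint. It assumes a Django database alias named `utc` for the Oracle connection and a hypothetical view name `V_STUDENT_DEPARTURES`; none of these names come from the real UTC DB.

```python
# Minimal sketch: expose one Oracle view as a JSON array.
# The "utc" connection alias and the view/column names are assumptions.
from django.db import connections
from django.http import JsonResponse
from django.views.decorators.http import require_GET


@require_GET  # all endpoints are read-only
def student_departures(request, login):
    with connections["utc"].cursor() as cursor:
        # Django uses %s placeholders for raw SQL on every backend, Oracle included.
        cursor.execute("SELECT * FROM V_STUDENT_DEPARTURES WHERE LOGIN = %s", [login])
        columns = [col[0] for col in cursor.description]
        rows = [dict(zip(columns, row)) for row in cursor.fetchall()]
    return JsonResponse(rows, safe=False)
```

The per-view endpoints can follow the same pattern; only the SQL changes.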
Step 2
- Create a mock of this new API so that we can easily use it in a dev environment (a sketch follows this list).
- Don't commit any personal data.
- You can use a local database (of your choice) to hold the dev data.
- All the mock data should be in the form of (committed) SQL scripts.
- You should update the docker-compose file with the new service(s).
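A sketch of what a mock endpoint could look like, assuming the mock app mirrors the real routes; the records below are invented placeholders (never data copied from the UTC DB), and in practice they would come from the local dev database seeded by the committed SQL scripts rather than be hard-coded.

```python
# Minimal sketch of a mock endpoint; the fake records are invented placeholders.
from django.http import JsonResponse
from django.views.decorators.http import require_GET

FAKE_DEPARTURES = [
    {"login": "jdoe", "destination": "Example University", "semester": "A2019"},
    {"login": "asmith", "destination": "Sample Tech", "semester": "P2020"},
]


@require_GET
def student_departures(request, login):
    rows = [row for row in FAKE_DEPARTURES if row["login"] == login]
    return JsonResponse(rows, safe=False)
```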
Step 3
- Create the Python scripts to handle the transfer of the data from the API created above into Django models (see the sketch after this list).
- Those scripts should be runnable on demand.
- Put them in a sub-application of the backend called `data_exchange`.
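One way to make such a script runnable on demand is a Django management command inside `data_exchange`. This is only a sketch: the endpoint URL, the `UniversityDeparture` model and its fields are assumptions made for illustration.

```python
# Sketch of an on-demand transfer script, as a management command in data_exchange.
import requests
from django.core.management.base import BaseCommand

from data_exchange.models import UniversityDeparture  # hypothetical model


class Command(BaseCommand):
    help = "Fetch the UTC data from the interface_UTC API and store it in the backend models."

    def handle(self, *args, **options):
        response = requests.get("http://interface_utc:8000/api/departures/", timeout=30)
        response.raise_for_status()
        for entry in response.json():
            # Upsert so the command can be re-run on demand without creating duplicates.
            UniversityDeparture.objects.update_or_create(
                login=entry["login"],
                semester=entry["semester"],
                defaults={"destination": entry["destination"]},
            )
        self.stdout.write(self.style.SUCCESS("UTC data transfer done."))
```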
Currency exchange rate
- Handle the connection to the Fixer API to retrieve the exchange rates (directly from `data_exchange`).
- Handle the update of the data in the model.
- Make it resilient (a sketch follows this list).
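A sketch of what the rate update could look like. It assumes a hypothetical `Currency` model, an API key read from the environment, and a Fixer response containing a `success` flag and a `rates` mapping; "resilient" is interpreted here as catching network and payload errors so a failed run does not crash the scheduler.

```python
# Sketch of the Fixer exchange-rate update; model and field names are assumptions.
import logging
import os

import requests

from data_exchange.models import Currency  # hypothetical model

logger = logging.getLogger(__name__)
FIXER_URL = "http://data.fixer.io/api/latest"


def update_exchange_rates():
    try:
        response = requests.get(
            FIXER_URL,
            params={"access_key": os.environ["FIXER_API_KEY"]},
            timeout=30,
        )
        response.raise_for_status()
        payload = response.json()
    except (requests.RequestException, ValueError) as exc:
        # Resilience: log and give up for this run instead of raising.
        logger.error("Fixer update failed: %s", exc)
        return
    if not payload.get("success"):
        logger.error("Fixer returned an error: %s", payload.get("error"))
        return
    for code, rate in payload["rates"].items():
        Currency.objects.update_or_create(
            code=code, defaults={"one_EUR_in_this_currency": rate}
        )
```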
Both
- Investigate the use of Cron (inside the backend docker container) to have an automated daily data update (see the sketch below): https://uwsgi-docs.readthedocs.io/en/latest/PythonDecorators.html
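Following the uWSGI decorators documentation linked above, the registration could look like the sketch below. It assumes the backend runs under uWSGI and that the two update functions from the previous steps exist under these hypothetical import paths.

```python
# Sketch of a daily job registered through uWSGI's cron decorator
# (arguments are minute, hour, day, month, weekday; -1 means "any").
from uwsgidecorators import cron

from data_exchange.utc import transfer_utc_data  # hypothetical helpers
from data_exchange.currencies import update_exchange_rates


@cron(0, 3, -1, -1, -1)  # every day at 03:00
def daily_data_update(signum):
    transfer_utc_data()
    update_exchange_rates()
```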
Finally
- Create a backend model and a frontend page showing when the latest update ran and the number of errors, so that anyone can see whether the system is healthy at any point in time (a model sketch follows).
- Also run clearsessions and delete users in the cron 🎉
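A minimal sketch of such a health model, assuming it lives in `data_exchange`; the model and field names are illustrative only.

```python
# Sketch of a model tracking each data update run, so the frontend can show system health.
from django.db import models


class DataUpdateRun(models.Model):
    source = models.CharField(max_length=50)  # e.g. "UTC" or "fixer"
    started_at = models.DateTimeField(auto_now_add=True)
    finished_at = models.DateTimeField(null=True, blank=True)
    nb_errors = models.PositiveIntegerField(default=0)
```

The frontend page can then simply list the most recent runs and highlight any row where `nb_errors` is greater than zero.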