Development¶
Grand-challenge is distributed as a set of containers that are defined and linked together in docker-compose.yml.
To develop the platform you need to have Docker and Docker Compose running on your system.
Installation¶
Download and install Docker
Linux: Docker and Docker Compose
Windows: WSL2, Docker Desktop for Windows with the Docker Desktop WSL2 Backend and Docker Compose
Note for Windows Users: we only support development using Windows 10 and WSL2.
Please ensure that the correct backend is enabled in your Docker settings, and run all of the following commands in the WSL shell.
At the time of writing, we use Ubuntu 20.04 from the Microsoft Store as the default distro.
As WSL2 is slow at syncing files between the Windows and WSL2 filesystems, it is best to check out the codebase within WSL itself.
The docker compose cycle script below utilizes Docker Buildx. If you followed the steps above, Buildx should have been installed alongside Docker. If the docker compose cycle invocation below crashes on Buildx, it is recommended to (re)install the latest version.
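You can check whether Buildx is available by running docker buildx version in your shell.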
Clone the repo
$ git clone https://github.com/comic/grand-challenge.org
$ cd grand-challenge.org
Set your local docker group id in your .env file
$ echo DOCKER_GID=`getent group docker | cut -d: -f3` > .env
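This writes a single line such as DOCKER_GID=999 to the .env file; the exact group id will differ between systems.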
You can then start the development site by invoking
$ make runserver
The app/ directory is mounted in the containers; werkzeug handles the file monitoring and will restart the process if any changes are detected.
You can kill the server with CTRL+C.
The Development Site¶
If you follow the installation instructions above you will be able to go to https://gc.localhost to see the development site. This uses a self-signed certificate, so you will need to accept the security warning.
The development site will apply all migrations and add a set of fixtures to help you with developing grand-challenge.org.
These fixtures include Archives, Reader Studies, Challenges, Algorithms and Workstations.
Some default users are created with specific permissions; for each of them the password is the same as the username.
These users include archive, readerstudy, demo, algorithm and workstation, who have permission to administer the existing fixtures and create new ones.
If you would like to test out the algorithms you can create a simple algorithm that lists its inputs in a results.json file by running
$ make algorithm_evaluation_fixtures
Before you run
$ make runserver
There is an interactive debugger from django-extensions which will halt on exceptions (see the RunServerPlus documentation). For interactive debugging it is really handy to place 1/0 in your code as a breakpoint.
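For example, a made-up view like the following (it only uses Django built-ins and is not part of the codebase) will drop you into the werkzeug debugger as soon as the page is requested:

# Hypothetical view, purely for illustration
from django.contrib.auth import get_user_model
from django.views.generic import ListView


class DebugUserList(ListView):
    model = get_user_model()

    def get_queryset(self):
        qs = super().get_queryset()
        1 / 0  # raises ZeroDivisionError, so werkzeug halts here with an interactive console
        return qs

Remove the 1/0 line and save the file to restore normal behaviour; werkzeug reloads the process automatically.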
Running the Tests¶
GitHub Actions is used to run the test suite on every new commit. You can also run the tests locally:
In a console window make sure the database is running
$ make runserver
Then in a second window run
$ docker compose run --rm celery_worker pytest -n 2
Replace 2 with the number of CPUs on your system; this runs the tests in parallel.
If you want to add a new test, please add it to the app/tests folder (a small example is shown at the end of this section).
If you only want to run the tests for a particular app, e.g. for teams, you can do
$ docker compose run --rm celery_worker pytest -k teams_tests
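As an illustration, a new test file placed under app/tests might look something like this sketch; the file name and assertion are made up, and only Django built-ins are used:

# app/tests/teams_tests/test_example.py  (hypothetical file)
import pytest
from django.contrib.auth import get_user_model


@pytest.mark.django_db
def test_new_users_are_active_by_default():
    user = get_user_model().objects.create(username="alice")
    assert user.is_active

Tests that touch the database need the pytest.mark.django_db marker so that pytest-django gives them database access.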
Development¶
You will need to install pre-commit so that the code is correctly formatted
$ python3 -m pip install pre-commit
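After installing it, you will typically also want to enable the git hook by running pre-commit install from the repository root, so that the checks run automatically on each commit.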
Please do all development on a branch and make a pull request to main; this will need to be reviewed before it is integrated.
We recommend using PyCharm for development.
Running through docker compose¶
You will need the Professional edition to use the docker compose integration. To set up the environment in PyCharm Professional 2018.1:
File -> Settings -> Project: grand-challenge.org -> Project Interpreter -> Cog wheel (top right) -> Add -> Docker Compose
Then select the docker server (usually the unix socket, or Docker for Windows)
Set the service to web
Click OK
Set the path mappings: Local path: <Project root>/app, Remote path: /app
Click OK
PyCharm will then spend some time indexing the packages within the container to help with code completion and inspections. If you edit any files, these will be updated on the fly by werkzeug.
PyCharm Configuration¶
It is recommended to set up the Django integration to ensure that code completion, tests and import optimisation work.
Open File -> Settings -> Languages and Frameworks -> Django
Check the Enable Django Support checkbox
Set the project root to <Project root>/app
Set the settings to config/settings.py
Check the Do not use the django test runner checkbox
In the settings window navigate to Tools -> Python integrated tools
Under the Testing section select pytest as the default test runner
Under the Docstrings section set NumPy as the docstrings format (an example NumPy-style docstring is shown after this list)
In the settings window navigate to Editor -> Code Style
Click on the Formatter Control tab and enable Enable formatter markers in comments
In the settings window navigate to Editor -> Code Style -> Python
On the Wrapping and Braces tab set Hard wrap at to 86 and Visual guide to 79
On the Imports tab enable Sort Import Statements, Sort imported names in "from" imports, and Sort plain and "from" imports separately in the same group
Click OK
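For reference, this is roughly what a NumPy-style docstring (the format selected above) looks like; the function itself is just an illustration:

def scale(values, factor=1.0):
    """Multiply each value by a constant factor.

    Parameters
    ----------
    values : list of float
        The values to scale.
    factor : float, optional
        The multiplier to apply, by default 1.0.

    Returns
    -------
    list of float
        The scaled values.
    """
    return [v * factor for v in values]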
Install the Flake8 Support plugin so that PyCharm will understand noqa comments. At the time of writing, the plugin is not compatible with PyCharm 2020. You can still install Flake8 as an external tool though. To do so, follow these steps:
Install flake8
$ pip install flake8
In PyCharm, in the settings window navigate to Tools -> External Tools and add a new one with the following configuration:
Program: file path to the flake8.exe you just installed
Arguments: $FilePath$
Working directory: $ProjectFileDir$
In the main window at the top right click the drop down box and then click Edit Configurations...
Click on templates -> Python Tests -> pytest, and enter --reuse-db in the Additional Arguments box and run --rm in the Command and options box under Docker Compose
It is also recommended to install the black extension for code formatting. You can add it as an external tool, following the same instructions as for Flake8 above.
Creating Migrations¶
If you change a models.py file then you will need to make the corresponding migration files.
You can do this with
$ make migrations
or, more explicitly
$ docker compose run --rm --user `id -u` web python manage.py makemigrations
Add the generated migration files to git and commit them.
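For example, adding a field to an existing model is the kind of change that needs a new migration; the model and field below are hypothetical and only meant as an illustration:

# app/grandchallenge/<some_app>/models.py  (hypothetical change)
from django.db import models


class ExampleItem(models.Model):
    title = models.CharField(max_length=64)
    # Newly added field: running make migrations afterwards generates the migration for it
    description = models.TextField(blank=True, default="")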
Building the documentation¶
Having built the web container with make runserver, you can use this to generate the docs with
$ make docs
This will create the docs in the docs/_build/html directory.
Adding new dependencies¶
Poetry is used to manage the dependencies of the platform. To add a new dependency use
$ poetry add <whatever>
and then commit the pyproject.toml and poetry.lock.
If this is a development dependency then use the --dev flag; see the poetry documentation for more details.
Versions are unpinned in the pyproject.toml file. To update the resolved dependencies use
$ poetry update <whatever>
and commit the updated poetry.lock.
The containers will need to be rebuilt after running these steps, so stop the make runserver process with CTRL+C and restart.
Going to Production¶
The docker compose file included here is for development only. If you want to run this in a production environment you will need to make several changes, including but not limited to:
Use gunicorn rather than runserver_plus to run the web process
Remove the users that are created by development_fixtures