Grand-challenge is distributed as a set of containers that are defined and linked together in the docker-compose configuration.
To develop the platform you need to have docker and docker-compose running on your system.
Download and install Docker
Clone the repo
$ git clone https://github.com/comic/grand-challenge.org
$ cd grand-challenge.org
Add the gc.localhost and minio-protected entries to your hosts file.
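The exact entries match the Docker for Windows ones listed later in this guide; on Linux or macOS a typical hosts-file addition is:

```
127.0.0.1 gc.localhost
127.0.0.1 minio-protected
```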
You can then start the site by invoking the cycle_docker_compose.sh script.
You can then navigate to https://gc.localhost in your browser to see the development site.
The site uses a self-signed certificate, so you will need to accept the security warning.
The app/ directory is mounted in the containers, and werkzeug handles the file monitoring, restarting the process if any changes are detected.
If you need to manually restart the process while cycle_docker_compose.sh is running, you can do so by pressing CTRL+D in the console window; you can also kill the server with CTRL+C.
Running Grand-Challenge within a Windows environment requires additional steps before invoking the commands above.
Install Make so that the make command is available.
Set an environment variable to enable Windows path conversions for Docker
$ export COMPOSE_CONVERT_WINDOWS_PATHS=1
Add the following lines to your hosts file (C:\Windows\System32\drivers\etc\hosts), choosing the block that matches your Docker installation:
# Using Docker for Windows:
127.0.0.1 gc.localhost
127.0.0.1 minio-protected
# Using Docker Toolbox:
192.168.99.100 gc.localhost
192.168.99.100 minio-protected
Share the drive where this repository is located with Docker
Docker for Windows
Right-click the Docker icon in the taskbar and click “Settings”
Go to “Shared drives” tab
Mark the checkbox of the drive where this repository is located
Go to the virtual machine that belongs to docker
Double click “Shared folders”
Click on the “Add New Shared Folder” button on the right
In the Folder Path box, type the drive letter where this repository is located (e.g. C:\)
In the Folder Name box, type the drive letter lowercased (e.g. c)
Restart the docker machine by typing docker-machine restart in your console
SSH into the docker VM with docker-machine ssh
Append the following lines to the file:
mkdir /home/docker/c # Change the 'c' to your drive letter
sudo mount -t vboxsf -o uid=1000,gid=50 c /home/docker/c # Again, change both 'c's to your drive letter
Running the Tests
GitHub Actions is used to run the test suite on every new commit. You can also run the tests locally.
In one console window, make sure the database is running.
Then in a second window run
$ docker-compose run --rm web pytest -n 2
Replace 2 with the number of CPUs that you have on your system; this runs the tests in parallel.
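If you would rather not hard-code the CPU count, it can be read from the machine. A small sketch (nproc is Linux coreutils; pytest-xdist also accepts -n auto to do this detection itself):

```shell
# Number of CPUs available on this machine (Linux coreutils)
CPUS="$(nproc)"
echo "detected ${CPUS} CPUs"
# The parallel test run above then becomes:
#   docker-compose run --rm web pytest -n "${CPUS}"
```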
If you want to add a new test, please add it to the tests for the relevant app.
If you only want to run the tests for a particular app, e.g. for teams, you can do
$ docker-compose run --rm web pytest -k teams_tests
You will need to install pre-commit so that the code is correctly formatted:
$ python3 -m pip install pre-commit
Then run pre-commit install in the repository root so the formatting hooks run on every commit.
Please do all development on a branch and make a pull request to master; this will need to be reviewed before it is integrated.
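The branch-and-review workflow can be sketched as follows; the branch name my-feature and the throwaway repository are purely illustrative:

```shell
# Illustration only: set up a throwaway repository
cd "$(mktemp -d)"
git init -q .
git -c user.name=dev -c user.email=dev@example.com \
    commit -q --allow-empty -m "initial"

# Do all development on a branch rather than on master
git checkout -q -b my-feature
git branch --show-current

# ...commit your changes, push the branch, then open a pull
# request against master and wait for review before merging
```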
We recommend using Pycharm for development.
Running through docker-compose
You will need the Professional edition to use the docker-compose integration. To set up the environment in PyCharm Professional 2018.1:
Cogwheel (top right) -> add a new remote interpreter using docker-compose
Then select the docker server (usually the unix socket)
Set the service to web
Set the path mappings from your local checkout to the corresponding paths in the container
Pycharm will then spend some time indexing the packages within the container to help with code completion and inspections. If you edit any files these will be updated on the fly by werkzeug.
It is recommended to set up Django integration to ensure that code completion, tests and import optimisation work.
Languages and Frameworks -> Django
Check the Enable Django Support checkbox
Set the project root to the app directory
Set the settings to the project's settings file
Check the Do not use the django test runner checkbox
In the settings window navigate to Python Integrated Tools
Under the Testing section select pytest as the default test runner
Under the Docstrings section set NumPy as the docstrings format
In the settings window navigate to Editor -> Code Style
Click on the Formatter Control tab and enable Enable formatter markers in comments
In the settings window navigate to Editor -> Code Style -> Python
On the Wrapping and Braces tab set Hard wrap at to 88 (the line length used by black)
Under the imports options, enable Sort Import Statements, Sort imported names in "from" imports, and Sort plain and "from" imports separately in the same group
Install the Flake8 Support plugin so that PyCharm will understand the project's flake8 configuration
It is also recommended to install the black extension (version 19.10b0) for code formatting.
Alternatively, it can be useful to run code from a local python environment - this allows for easier debugging and does
not require e.g. the professional edition of PyCharm. The setup described here uses all services from the normal
docker-compose stack, except for the web service. Though this service is running, a separate Django dev server is
started in PyCharm (or from the terminal). As the dev server is running on port
8000 by default, there is no port conflict
with the service running in the docker container.
Start the docker-compose stack for the database and celery task handling
Make sure you have poetry installed.
In a new terminal, create a new virtual python environment by running poetry install in this repository’s root folder.
Activate the virtual env (e.g. with poetry shell).
Load the environmental variables contained in .env.local:
$ export $(cat .env.local | egrep -v "^#" | xargs)
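The export line works by stripping comment lines and handing the remaining KEY=VALUE pairs to export. A self-contained sketch of the same pattern, with hypothetical variable names:

```shell
# Write a sample env file (variable names are illustrative only)
cat > /tmp/sample.env <<'EOF'
# comment lines are filtered out by egrep -v "^#"
DJANGO_DEBUG=True
POSTGRES_HOST=localhost
EOF

# Same pattern as the command above, pointed at the sample file
export $(cat /tmp/sample.env | egrep -v "^#" | xargs)
echo "$DJANGO_DEBUG $POSTGRES_HOST"
```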
Run migrations and check_permissions (optionally load demo data).
$ cd app
$ python manage.py migrate
$ python manage.py check_permissions
$ python manage.py init_gc_demo
You can now start the server using python manage.py runserver_plus.
To setup PyCharm:
Project Interpreter -> Select your created virtual environment
For each run/debug configuration, make sure the environmental variables are loaded; the easiest way is to use a .env plugin. Alternatively, they can be pasted after pressing the folder icon in the Environment variables field.
It is also useful to set up the built-in Python/Django console in PyCharm:
Console -> Python/Django console. Choose the same python interpreter here, and make sure to load the environmental variables (the .env plugin cannot be used here; the variables can only be pasted).
If you change a models.py file then you will need to make the corresponding migration files.
You can do this with
$ make migrations
or, more explicitly
$ docker-compose run --rm --user `id -u` web python manage.py makemigrations
Add these to git and commit.
Building the documentation
Having built the web container with cycle_docker_compose.sh, you can use this to generate the docs with
$ make docs
This will create the rendered documentation locally.
Adding new dependencies
Poetry is used to manage the dependencies of the platform. To add a new dependency use
$ poetry add <whatever>
and then commit the updated pyproject.toml and poetry.lock files.
If this is a development dependency then use the --dev flag; see the poetry documentation for more details.
Versions are unpinned in the pyproject.toml file; to update the resolved dependencies use
$ poetry lock
and commit the updated poetry.lock file.
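For illustration, an unpinned dependency section in pyproject.toml looks roughly like this (the exact package list here is hypothetical); poetry lock resolves these to the exact versions recorded in poetry.lock:

```toml
[tool.poetry.dependencies]
django = "*"
celery = "*"
```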
The containers will need to be rebuilt after running these steps, so stop the cycle_docker_compose.sh process with CTRL+C and restart it.
Going to Production
The docker compose file included here is for development only. If you want to run this in a production environment you will need to make several changes, including but not limited to:
Using gunicorn rather than runserver_plus to run the web process
Removing the users that are created by the demo initialisation (e.g. init_gc_demo)