A Django project boilerplate/template with a multitude of state-of-the-art libraries and tools. If pairing Django with React is a possibility for your project or spinoff, this is the best solution available. Save time with tools like:
- React, for building interactive UIs
- Poetry, for managing the environment and its dependencies
- django-js-reverse, for generating URLs on JS
- React Bootstrap, for responsive styling
- Webpack, for bundling static assets
- Celery, for background worker tasks
- WhiteNoise with brotlipy, for efficient static files serving
- ruff and ESLint with pre-commit for automated quality assurance (does not replace proper testing!)
For continuous integration, a GitHub Actions configuration `.github/workflows/main.yml` is included.
It also includes a Render.com `render.yaml` and a working Django `production.py` settings module, enabling easy deployments with the 'Deploy to Render' button. The `render.yaml` includes the following:
- PostgreSQL, for DB
- Redis, for Celery
- React
  - `react` for building interactive UIs
  - `react-dom` for rendering the UI
  - `react-router` for page navigation
  - `webpack` for bundling static assets
  - `webpack-bundle-tracker` for providing the bundled assets to Django
- Styling
  - `bootstrap` for providing responsive stylesheets
  - `react-bootstrap` for providing components built on top of Bootstrap CSS without using plugins
  - `sass` for providing compatibility with SCSS files
- State management and backend integration
  - `axios` for performing asynchronous calls
  - `cookie` for easy integration with Django using the `csrftoken` cookie
  - `@reduxjs/toolkit` for easy state management across the application, with the whole toolkit, including devtools for inspecting and debugging Redux via the browser and the ability to run thunks for interacting with the Redux store through asynchronous logic
  - `connected-react-router` for integrating Redux with React Router
  - `history` for providing browser history to Connected React Router
  - `react-redux` for integrating React with Redux
- Utilities
  - `lodash` for general utility functions
  - `classnames` for easy working with complex CSS class names on components
  - `react-refresh` for improving QoL while developing through automatic browser refreshing
- `django` for building backend logic using Python
- `djangorestframework` for building a REST API on top of Django
- `django-webpack-loader` for rendering the bundled frontend assets
- `django-js-reverse` for easy handling of Django URLs on JS
- `psycopg2` for using the PostgreSQL database
- `sentry-sdk` for error monitoring
- `python-decouple` for reading environment variables on settings files
- `celery` for background worker tasks
- `django-debreach` for additional protection against the BREACH attack
- `whitenoise` and `brotlipy` for serving static assets
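As a rough illustration of how a couple of these backend pieces usually fit together, here is a minimal settings sketch, assuming python-decouple for environment variables and WhiteNoise for static files. It is not a copy of the boilerplate's actual settings files:

```python
# Illustrative settings sketch only -- the boilerplate's real settings modules may differ.
from decouple import config  # python-decouple: reads values from the environment / .env file

SECRET_KEY = config("SECRET_KEY")  # keep secrets out of source control

MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "whitenoise.middleware.WhiteNoiseMiddleware",  # WhiteNoise serves static files directly from Django
    # ... the rest of the middleware stack
]

# Compressed (Brotli/gzip) and hashed static files served by WhiteNoise.
# On Django 4.2+ this is usually configured via the STORAGES setting instead.
STATICFILES_STORAGE = "whitenoise.storage.CompressedManifestStaticFilesStorage"
```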
Several people have leveraged our boilerplate to start spinoffs or to boost their efforts in the challenging pursuit of securing funding. Starting with a solid foundation allows you to create more resilient products and focus on what really matters: discovering and delivering value to your customers. If you are one of those people, we're eager to help you even more! We can spread the word about your project across our social media platforms, giving you access to a broader audience.
Send us an email at [email protected] telling us a bit more about how our boilerplate helped you boost your project.
- Set up editorconfig, ruff, and ESLint in the text editor you will use to develop.
- Do the following:
  - Create a git-untracked `local.py` settings file: `cp backend/new_site_dbp/settings/local.py.example backend/new_site_dbp/settings/local.py`
  - Create a git-untracked `.env` file: `cp backend/.env.example backend/.env`
- Open the `backend/.env` file in a text editor and uncomment the line `DATABASE_URL=postgres://new_site_dbp:password@db:5432/new_site_dbp`
- Open a new command line window and go to the project's directory
- Run the initial setup: `make docker_setup`
- Create the migrations for the `users` app: `make docker_makemigrations`
- Run the migrations: `make docker_migrate`
- Run the project: `make docker_up`
- Access `http://localhost:8000` in your browser and the project should be running there
  - When you run `make docker_up`, some containers are spun up (frontend, backend, database, etc.) and each one will be running on a different port
  - The container with the React app uses port 3000. However, if you try accessing it in your browser, the app won't appear there and you'll probably see a blank page with the "Cannot GET /" error
  - This happens because the container responsible for serving the whole application is the Django one (running on port 8000). The frontend container only provides a bundle with its assets for django-webpack-loader to consume and render them on a Django template
- To access the logs for each service, run: `make docker_logs <service name>` (either `backend`, `frontend`, etc.)
- To stop the project, run: `make docker_down`
- Open a new command line window and go to the project's directory
- Update the dependency management files by performing any number of the following steps:
  - To add a new frontend dependency, run `npm install <package name> --save`
    - The above command will update your `package.json`, but won't make the change effective inside the container yet
  - To add a new backend dependency, run `docker compose run --rm backend bash` to open an interactive shell and then run `poetry add {dependency}` to add the dependency. If the dependency should only be available for development, append `-G dev` to the command.
- After updating the desired file(s), run `make docker_update_dependencies` to update the containers with the new dependencies
  - The above command will stop and re-build the containers in order to make the new dependencies effective
- Open a new command line window and go to the project's directory
- Run `npm install`
- Run `npm run dev`
  - This is used to serve the frontend assets to be consumed by django-webpack-loader, not to run the React application as usual, so don't worry if you check what's running on port 3000 and see an error in your browser
- Open the `backend/.env` file in a text editor and do one of the following:
  - If you wish to use SQLite locally, uncomment the line `DATABASE_URL=sqlite:///backend/db.sqlite3`
  - If you wish to use PostgreSQL locally, uncomment and edit the line `DATABASE_URL=postgres://new_site_dbp:password@db:5432/new_site_dbp` in order to make it correctly point to your database URL
    - The URL format is the following: `postgres://USER:PASSWORD@HOST:PORT/NAME`
  - If you wish to use another database engine locally, add a new `DATABASE_URL` setting for the database you wish to use
    - Please refer to dj-database-url on how to configure `DATABASE_URL` for commonly used engines
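For reference, a `DATABASE_URL` value like the ones above is typically turned into Django's `DATABASES` setting with dj-database-url. The snippet below is an illustrative sketch, not the boilerplate's exact settings code:

```python
# Illustrative sketch: parse DATABASE_URL into Django's DATABASES setting.
import dj_database_url
from decouple import config  # python-decouple reads DATABASE_URL from backend/.env

DATABASES = {
    "default": dj_database_url.parse(config("DATABASE_URL")),
}
```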
- Open a new command line window and go to the project's directory
- Run `poetry install`
- Go to the `backend` directory
- Create the migrations for the `users` app: `poetry run python manage.py makemigrations`
- Run the migrations: `poetry run python manage.py migrate`
- Run the project: `poetry run python manage.py runserver`
- Open a browser and go to `http://localhost:8000` to see the project running
To run a Celery worker locally: `poetry run celery --app=myproject worker --loglevel=info`
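To check that the worker is actually consuming jobs, you can queue a trivial task from a Django shell. The task below is a hypothetical example; the `users/tasks.py` location and the task name are illustrative, not necessarily present in the boilerplate:

```python
# users/tasks.py -- hypothetical example task used only to verify the worker is running
from celery import shared_task


@shared_task
def ping():
    # Returns a constant so the result is easy to spot in the worker logs
    return "pong"
```

With the worker running, calling `ping.delay()` from `poetry run python manage.py shell` should show the task being received and executed in the worker output.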
- For development, we use Mailhog to test our e-mail workflows, since it allows us to inspect the messages to validate they're correctly built
- Docker users already have it setup and running once they start the project
- For non-Docker users, please have a look here for instructions on how to setup Mailhog on specific environments
The project expects the Mailhog SMTP server to be running on port 1025; you may alter that by changing `EMAIL_PORT` in the settings.
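For context, the Mailhog-related settings usually look like the sketch below, assuming the SMTP backend and python-decouple; the names and defaults are illustrative, so check the actual settings files:

```python
# Illustrative local email settings for Mailhog
from decouple import config

EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = config("EMAIL_HOST", default="localhost")
EMAIL_PORT = config("EMAIL_PORT", default=1025, cast=int)  # Mailhog's default SMTP port
```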
`make test`
Will run the Django tests using `--keepdb` and `--parallel`. You may pass a path to the desired test module in the make command. E.g.:
`make test someapp.tests.test_views`
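A test module passed to `make test` is just a regular Django test case. The example below is hypothetical; the app, module, and URL name are placeholders:

```python
# someapp/tests/test_views.py -- hypothetical example of a test run by `make test`
from django.test import TestCase
from django.urls import reverse


class IndexViewTests(TestCase):
    def test_index_returns_200(self):
        # "index" is a placeholder URL name; use a route that exists in your urls.py
        response = self.client.get(reverse("index"))
        self.assertEqual(response.status_code, 200)
```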
To add a new backend dependency, run `poetry add {dependency}`. If the dependency should only be available for development, append `-G dev` to the command.
To enable Continuous Integration through GitHub Actions, we provide a `proj_main.yml` file. To connect it to GitHub, you need to rename it to `main.yml` and move it to the `.github/workflows/` directory.
You can do it with the following commands:
```bash
mkdir -p .github/workflows
mv proj_main.yml .github/workflows/main.yml
```
This project comes with a `render.yaml` file, which can be used to create an app on Render.com from a GitHub repository.
Before deploying, please make sure you've generated an up-to-date `poetry.lock` file containing the Python dependencies. This is necessary even if you've used Docker for local runs. Do so by following these instructions.
After setting up the project, you can initialize a repository and push it to GitHub. If your repository is public, you can use the following button:
If you are in a private repository, access the following link, replacing `$YOUR_REPOSITORY_URL$` with your repository link:
`https://render.com/deploy?repo=$YOUR_REPOSITORY_URL$`
Keep reading to learn how to configure the prompted environment variables.
Chances are your project name isn't unique in Render, and you'll get a randomized suffix as your full app URL, like `https://new_site_dbp-a1b2.onrender.com`.
But this will only happen after the first deploy, so you are not able to properly fill `ALLOWED_HOSTS` yet. Simply set it to `*`, then fix it later to something like `new_site_dbp-a1b2.onrender.com` and your domain name like `example.org`.
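One common way to avoid hard-coding this is reading `ALLOWED_HOSTS` from an environment variable, for example with python-decouple's `Csv` cast. This is a sketch of the pattern, not necessarily how the boilerplate's `production.py` does it:

```python
# Illustrative sketch: read ALLOWED_HOSTS from the environment as a comma-separated list
from decouple import Csv, config

# e.g. ALLOWED_HOSTS=new_site_dbp-a1b2.onrender.com,example.org
ALLOWED_HOSTS = config("ALLOWED_HOSTS", cast=Csv(), default="*")
```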
Default is 1, meaning the build script will run `collectstatic` during deploys.
By default, the project will always run the render_build.sh
script during deployments. This script does the following:
- Build the frontend
- Build the backend
- Run Django checks
- Run `collectstatic`
- Run Django migrations
- Push frontend source maps to Sentry
As there aren't free plans for Workers on Render.com, the configuration for Celery workers/beat is commented out by default in the `render.yaml`. This means Celery won't be available by default.
Uncommenting the worker configuration lines in `render.yaml` will incur costs.
To enable sending emails from your application you'll need to have a valid SendGrid account and also a valid verified sender identity. After finishing the validation process you'll be able to generate the API credentials and define the `SENDGRID_USERNAME` and `SENDGRID_PASSWORD` environment variables on Render.com.
These variables are required for your application to work on Render.com since it's pre-configured to automatically email admins when the application is unable to handle errors gracefully.
Media files integration with S3 or similar is not supported yet. Please feel free to contribute!
Sentry is already set up on the project. For production, add the `SENTRY_DSN` environment variable on Render.com, with your Sentry DSN as the value.
You can test your Sentry configuration by deploying the boilerplate with the sample page and clicking on the corresponding button.
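For reference, a typical sentry-sdk initialization driven by that environment variable looks like the sketch below; it is illustrative only, and the boilerplate's actual wiring may differ:

```python
# Illustrative sketch: initialize Sentry only when SENTRY_DSN is provided
import sentry_sdk
from decouple import config
from sentry_sdk.integrations.django import DjangoIntegration

SENTRY_DSN = config("SENTRY_DSN", default="")
if SENTRY_DSN:
    sentry_sdk.init(dsn=SENTRY_DSN, integrations=[DjangoIntegration()])
```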
The `render_build.sh` script has a step to push JavaScript source maps to Sentry; however, some environment variables need to be set on Render.com.
The environment variables that need to be set are:
- `SENTRY_ORG` - Name of the Sentry organization that owns your Sentry project.
- `SENTRY_PROJECT_NAME` - Name of the Sentry project.
- `SENTRY_API_KEY` - Sentry API key that needs to be generated on Sentry. You can find or create authentication tokens within Sentry.
After setting these environment variables, your next Render.com deploys will create a release on Sentry where the release name is the commit SHA, and the source maps will be pushed to it.
- At pre-commit time (see below)
- Manually with `poetry run ruff` and `npm run lint` on project root.
- During development with an editor compatible with ruff and ESLint.
- Not supported yet. Please feel free to contribute!
- On project root, run `poetry run pre-commit install` to enable the hook into your git repo. The hook will run automatically for each commit.
Some settings defaults were decided based on Vinta's experiences. Here's the rationale behind them:
- Using atomic requests in production prevents several database consistency issues. Check Django docs for more details.
- Important: When you are queueing a new Celery task directly from a Django view, particularly with little or no delay/ETA, it is essential to use `transaction.on_commit(lambda: my_task.delay())`. This ensures that the task is only queued after the associated database transaction has been successfully committed (see the sketch after this list).
  - If `transaction.on_commit` is not utilized, or if a significant delay is not set, you risk encountering race conditions. In such scenarios, the Celery task might execute before the completion of the request's transaction. This can lead to inconsistencies and unexpected behavior, as the task might operate on a database state that does not yet reflect the changes made in the transaction. Read more about this problem in this article.
- We believe Celery tasks should be idempotent. So for us it's safe to set `CELERY_ACKS_LATE = True` to ensure tasks will be re-queued after a worker failure. Check Celery docs on "Should I use retry or acks_late?" for more info.
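Here is the `transaction.on_commit` pattern mentioned above as a small view sketch; the view name and the imported task are hypothetical, not part of the boilerplate:

```python
# Illustrative view sketch: queue a Celery task only after the request's transaction commits
from django.contrib.auth import get_user_model
from django.db import transaction
from django.http import JsonResponse

from users.tasks import send_welcome_email  # hypothetical task module and name


def signup_view(request):
    # The row is created inside the request's (atomic) transaction
    user = get_user_model().objects.create_user(username=request.POST["username"])
    # The callback fires only after a successful commit, so the worker never races
    # the transaction and always sees the newly created row
    transaction.on_commit(lambda: send_welcome_email.delay(user.pk))
    return JsonResponse({"id": user.pk})
```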
If you wish to contribute to this project, please first discuss the change you wish to make via an issue.
Check our contributing guide to learn more about our development process and how you can test your changes to the boilerplate.
This project is maintained by Vinta Software and is used in products of Vinta's clients. We are always looking for exciting work! If you need any commercial support, feel free to get in touch: [email protected]