SolarSPELL Content Curation service (CC). The Content Curation service provides functionality for library curators to submit new content and its associated metadata, in multiple languages, for approval. A Content Curation Admin has all the abilities of a library curator and can also approve, modify, or outright reject content submitted for approval. A Content Curation Admin can also add new language categories, create and remove users, and modify user roles to create other admins or content curators.
Install the latest versions of Git, npm, Python, and PostgreSQL. If you would like to set up your local environment using Docker instead, skip ahead to the Docker section below.
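To confirm the prerequisites are available, you can check that each tool is on your PATH (a quick sanity check; this project does not document specific minimum versions):

```bash
git --version
node --version && npm --version
python3 --version   # or `python --version`, depending on your system
psql --version
```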
1. Using Git, clone the project into a new directory:

    ```
    git clone https://github.com/SolarSPELL-Main/content-curation.git
    ```
2. In the `frontend` directory, run

    ```
    npm install
    ```

    This installs all the packages npm needs to build the frontend. Then build the frontend using

    ```
    npm run-script build
    ```
3. Back in the main directory, run

    ```
    pip3 install -r requirements.txt
    ```

    This installs all the Python dependencies. Depending on your system, you may need to use `pip` instead of `pip3`. Likewise, `python3` may need to be replaced with `python` in the commands below. When using bash or a similar shell, you may be able to run `./manage.py` instead of `python3 manage.py` to save time.
4. Create a PostgreSQL database. On Linux, you should do this as a separate user account created specifically for PostgreSQL. The exact procedure varies by OS, so consult a PostgreSQL tutorial for your platform; a sketch of the typical commands is shown after this list.
5. Copy the file `content_curation/env.example` and rename the copy to `content_curation/.env`. If your system file browser doesn't let you do this, try the file browser in VSCode or another IDE.

6. Edit the newly created `content_curation/.env` file, setting `SECRET_KEY` to a new random string and `DATABASE_URL` to the connection string of the database you created in step 4. In a production environment, also change `DEBUG` to `False`. (An illustrative `.env` sketch is shown after this list.)
7. Run

    ```
    python3 manage.py migrate
    ```

    This initializes the Django database with the correct schema.
8. Open a Django shell using

    ```
    python3 manage.py shell
    ```

    and enter the following commands. If you aren't running this on localhost, replace `127.0.0.1` with your domain name (e.g. `example.com`):

    ```
    >>> from django.contrib.sites.models import Site
    >>> site = Site.objects.create(domain='127.0.0.1', name='127.0.0.1')
    >>> site.save()
    ```

    (credit: https://stackoverflow.com/questions/16068518/django-site-matching-query-does-not-exist)

    Enter `exit()` to exit the Django shell.
9. Create a superuser using

    ```
    python3 manage.py createsuperuser
    ```

    Enter a username, email, and password. These will be your credentials for logging in to the admin panel.
10. Run the server using

    ```
    python3 manage.py runsslserver 127.0.0.1:8000
    ```

    This is only suitable for a development server; a production deployment should be set up differently according to your server environment.
11. Log in to the admin console at https://127.0.0.1:8000/admin/. Make sure you are using `https`, and accept the security warning in your browser. Under Social Applications, click "Add social application" in the top right. Select Google as the Provider and enter "google" as the Name. The client ID and secret key can be generated with the Google Cloud Console; if you haven't done this before, you will likely need an external tutorial. If you are working with others, someone may already have credentials you can reuse, since multiple people can share the same client ID and secret. Note that the Key field on the entry form can be left blank. Under Sites, make sure the 127.0.0.1 site (or whichever site you created in step 8) is selected.
12. You can now view the content curation site at https://127.0.0.1:8000/static/index.html/
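For step 4, one way to create the database from the command line on Linux is sketched below. The database name (`content_curation`), user (`curation_user`), and password are placeholders rather than values required by the project; whatever you choose must match the `DATABASE_URL` you set in step 6.

```bash
# Run as the postgres system user (Linux; the procedure varies by OS).
# The names below are placeholders -- pick your own.
sudo -u postgres createuser --pwprompt curation_user
sudo -u postgres createdb --owner=curation_user content_curation
```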
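For steps 5 and 6, the copy can also be done from a terminal, and Django can generate a suitable random `SECRET_KEY` for you. This is a minimal sketch; `env.example` may contain additional settings not shown here.

```bash
# Copy the example file (useful if your file browser hides dotfiles).
cp content_curation/env.example content_curation/.env

# One way to generate a random string for SECRET_KEY:
python3 -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
```

The resulting `.env` might then look something like this, with placeholder values matching the database sketched above:

```
SECRET_KEY=paste-the-generated-random-string-here
# Match the database, user, and password you created in step 4:
DATABASE_URL=postgres://curation_user:yourpassword@localhost:5432/content_curation
# Change to False in a production environment:
DEBUG=True
```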
Hot Reloading:

- Run `cd frontend && npm run watch` to automatically rebuild the frontend on change.
- Open another terminal and run `cd .. && python manage.py runserver 0.0.0.0:8000`.
Docker:

This section describes how to set up a local development environment using Docker. It is optional; if you have already installed the app by following the installation instructions above, feel free to skip it. The Docker setup automates several of the installation steps.
- Clone the repo and build the frontend app (same as steps 1 and 2 above).
- Install Docker.
- Create a `content_curation/.env` file, same as steps 5 and 6 above, except set `DATABASE_URL=postgres://postgres:postgres@db:5432/db`. This is necessary because the database will be running in a Docker container rather than on the host machine.
- Run `cd frontend && npm run watch` to automatically rebuild the frontend files on change.
- Open another terminal, navigate to the root directory of the repo, and run `docker-compose up`. This takes care of steps 3-4 and 7-10 automatically.
- Wait until the log shows the app is running. Then set up Google OAuth and start developing as described in steps 11-12, except use `localhost` instead of `127.0.0.1` in the URLs (because that is how it is configured in start.sh) and use `http` in place of `https`, since SSL is not set up for local development.
  - Admin site: http://localhost:8000/admin/
  - App site: http://localhost:8000/static/index.html/
- You can navigate to pgAdmin at http://localhost:5051/ to view the data in the database. pgAdmin's login credentials are specified in the `pgadmin` container definition in docker-compose.yml. After logging in, create a server connection to the database using the information in the `db` container definition.
The API will automatically reload every time the code changes. However, if you wish to run data migrations on the existing API container, `web`, you need to execute the migrate commands inside the container rather than directly on the host. In fact, any `python manage.py` command should be executed inside the container. You can use `docker ps` to get the ID of the running container and `docker exec -it <CONTAINER_ID> bash` to log into it.
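Putting that together, a typical migration run against the `web` container looks something like this (`<CONTAINER_ID>` is whatever `docker ps` reports for the web service):

```bash
docker ps                             # find the ID of the running web container
docker exec -it <CONTAINER_ID> bash   # open a shell inside it
python manage.py migrate              # manage.py commands run from inside the container
exit                                  # leave the container shell when done
```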
Resources: