Commit

Merge branch 'main' into hotfix_3.24.2_add_info_to_logs
thicham43 authored Nov 26, 2024
2 parents a40675e + e1c2fb7 commit 90ff75d
Showing 20 changed files with 233 additions and 175 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/commitlint.yml
Original file line number Diff line number Diff line change
@@ -10,7 +10,7 @@ jobs:
commit-lint:
runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v3
+ - uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Install commit-lint
32 changes: 13 additions & 19 deletions .github/workflows/main.yml
@@ -14,11 +14,11 @@ jobs:
lint:
runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v3
+ - uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Python
- uses: actions/setup-python@v1
+ uses: actions/setup-python@v5
with:
python-version: 3.11
- name: Install system deps
@@ -55,10 +55,12 @@ jobs:
steps:
- name: Checkout code
- uses: actions/checkout@v3
+ uses: actions/checkout@v4
+ with:
+ fetch-depth: 0 # Shallow clones should be disabled for a better relevancy of analysis

- name: Set up Python 3.11
- uses: actions/setup-python@v4
+ uses: actions/setup-python@v5
with:
python-version: '3.11'

@@ -96,19 +98,11 @@ jobs:
coverage xml --omit="*/test*","*/migrations/*" -o coverage.xml
coverage html --omit="*/test*","*/migrations/*" -d htmlcov
- # TODO activate this when the project is public
- # sonarcloud:
- # name: SonarCloud
- # runs-on: ubuntu-latest
- # steps:
- # - uses: actions/checkout@v3
- # with:
- # fetch-depth: 0 # Shallow clones should be disabled for a better relevancy of analysis
- # - name: SonarCloud Scan
- # uses: SonarSource/sonarcloud-github-action@master
- # env:
- # GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any
- # SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
+ - name: SonarCloud Scan
+ uses: SonarSource/sonarcloud-github-action@master
+ env:
+ GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any
+ SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

publish:
if: github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/')
@@ -117,10 +111,10 @@ jobs:

steps:
- name: Checkout code
- uses: actions/checkout@v3
+ uses: actions/checkout@v4

- name: Set up Python 3.11
- uses: actions/setup-python@v4
+ uses: actions/setup-python@v5
with:
python-version: '3.11'

51 changes: 51 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,57 @@

All notable changes to this project will be documented in this file.

+ ## [3.24.1] - 2024-11-21
+
+ ### 🚀 Features
+
+ - Add new fhir perimeter sync app (#405)
+ - Add type and message to maintenance phase (#416)
+ - *(maintenance)* Add ws event for started and ended maintenance phases (#418)
+ - Add user info in log records (#422)
+ - Add request migration script 1.6.0
+
+ ### 🐛 Bug Fixes
+
+ - Adjust CohortRights serializer (#407)
+ - Hotfix 3.23.2, check user is not anonymous (#408)
+ - Hotfix 3.23.3 to get perimeters from snapshots (#409)
+ - Hotfix 3.23.4 impersonate users (#410)
+ - *(swagger)* Remove clientSecret setting
+ - *(static)* Remove old static files
+ - Hotfix 3.23.5 exports (#411)
+ - *(exports)* Hotfix 3.23.7 notifs and files extensions (#413)
+ - Hotfix 3.23.8 exports in one file (#414)
+ - Hotfix 3.23.9 downloading xlsx/csv exports (#415)
+ - Fhir perimeter source type + django max request line
+ - *(cohort)* USE_SOLR boolean matching
+ - Plug in actual cohort operators
+ - Feasibility study serializers
+ - *(exports)* Set file extension to .zip
+ - Add traceId header
+ - Remove extra arg
+ - *(exports)* Do not create sub-cohort for measurement table
+ - *(migrationscripts)* Add resource type to filter + fix basic resource postprocess
+ - *(migrationscript)* Add fix request migration script
+ - Hotfix 3.23.16 xlsx exports right verif (#424)
+ - *(accesses)* Get only perimeters with defined rights
+
+ ### 🚜 Refactor
+
+ - *(ws)* Move websocket manager to main module (#417)
+ - Settings per app config (#420)
+ - Update download url and add serializers
+ - Always download a zip file
+
+ ### 📚 Documentation
+
+ - Update changelog
+
+ ### ⚙️ Miscellaneous Tasks
+
+ - Set version 3.24.0-SNAPSHOT
+ - Get project version from settings
+
## [3.23.10] - 2024-09-02

### 🐛 Bug Fixes
2 changes: 2 additions & 0 deletions README.md
@@ -1,3 +1,5 @@
[![Actions Status](https://github.com/aphp/Cohort360-Back-end/workflows/main/badge.svg)](https://github.com/aphp/Cohort360-Back-end/actions)
[![Quality Gate](https://sonarcloud.io/api/project_badges/measure?project=aphp_Cohort360-Back-end&metric=alert_status)](https://sonarcloud.io/dashboard?id=aphp_Cohort360-Back-end)
+ ![image](https://img.shields.io/badge/Python-3.11-blue/?color=blue&logo=python&logoColor=9cf)
+ ![image](https://img.shields.io/badge/Django-5.0-%2344b78b/?color=%2344b78b&logo=django&logoColor=green)

2 changes: 1 addition & 1 deletion admin_cohort/models/user.py
@@ -20,7 +20,7 @@ def __repr__(self):
return f"User {self})"

@property
- def display_name(self):
+ def display_name(self) -> str:
deleted_suffix = self.delete_datetime and " (Supprimé)" or ""
return f"{self}{deleted_suffix}"

2 changes: 1 addition & 1 deletion admin_cohort/settings.py
@@ -8,7 +8,7 @@
from celery.schedules import crontab

TITLE = "Portail/Cohort360 API"
- VERSION = "3.24.2"
+ VERSION = "3.25.0-SNAPSHOT"
AUTHOR = "Assistance Publique - Hopitaux de Paris, Département I&D"
DESCRIPTION_MD = f"""Supports the official **Cohort360** web app and **Portail**
Built by **{AUTHOR}**
29 changes: 0 additions & 29 deletions exporters/base_exporter.py
@@ -1,13 +1,11 @@
import logging
import time
- from typing import List

from django.utils import timezone
from requests import RequestException

from admin_cohort.models import User
from admin_cohort.types import JobStatus
- from cohort.models import CohortResult
from exports.emails import check_email_address
from exports.models import Export, Datalab, ExportTable
from exports.services.rights_checker import rights_checker
@@ -28,38 +26,11 @@ def __init__(self):

def validate(self, export_data: dict, **kwargs) -> None:
owner = kwargs["owner"]
- self.validate_tables_data(tables_data=export_data.get("export_tables", []))
check_email_address(owner.email)
self.check_user_rights(export_data=export_data, **kwargs)
export_data['request_job_status'] = JobStatus.validated
self.complete_data(export_data=export_data, owner=owner)

- @staticmethod
- def using_new_export_models(export_data: dict) -> bool:
- # todo: 2b removed once starting to use new models
- return "export_tables" in export_data
-
- def validate_tables_data(self, tables_data: List[dict]) -> bool:
- required_table = self.export_api.required_table
- base_cohort_provided = False
- required_table_provided = False
- for table_data in tables_data:
- source_cohort_id = table_data.get('cohort_result_source')
-
- if required_table in table_data.get("table_ids"):
- required_table_provided = True
- if not source_cohort_id:
- raise ValueError(f"The `{required_table}` table can not be exported without a source cohort")
-
- if source_cohort_id:
- if CohortResult.objects.filter(pk=source_cohort_id, request_job_status=JobStatus.finished).exists():
- base_cohort_provided = True
- else:
- raise ValueError(f"Cohort `{source_cohort_id}` not found or did not finish successfully")
-
- if not required_table_provided and not base_cohort_provided:
- raise ValueError(f"`{required_table}` table was not specified; must then provide source cohort for all tables")
- return True

@staticmethod
def check_user_rights(export_data: dict, **kwargs) -> None:
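The diff above removes table validation from `BaseExporter.validate`, leaving only the shared checks; subclasses now run their own table validation before deferring to `super()`. A simplified, runnable sketch of that template-method shape (the email/rights checks are elided and `required_table = "person"` is an assumed value, not confirmed by this diff):

```python
class BaseExporter:
    """Simplified sketch: only the shared, format-agnostic validation step."""
    def validate(self, export_data: dict, **kwargs) -> None:
        # The real code also checks the owner's email and export rights here.
        export_data["request_job_status"] = "validated"

class HiveExporter(BaseExporter):
    required_table = "person"  # assumption; the real value comes from export_api

    def validate(self, export_data: dict, **kwargs) -> None:
        # Format-specific table checks run in the subclass first,
        # then the shared pipeline runs via super().
        tables = export_data.get("export_tables", [])
        if not any(t.get("table_name") == self.required_table for t in tables):
            raise ValueError(f"`{self.required_table}` table is missing")
        super().validate(export_data, **kwargs)

data = {"export_tables": [{"table_name": "person"}]}
HiveExporter().validate(data)
print(data["request_job_status"])  # validated
```

This keeps the base class reusable by exporters (such as CSV) whose payload no longer carries per-table cohort data.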
22 changes: 6 additions & 16 deletions exporters/csv_exporter.py
@@ -1,5 +1,4 @@
import os
- from typing import List

from django.db.models import Q

@@ -17,24 +16,15 @@ def __init__(self):
self.type = ExportTypes.CSV.value
self.target_location = os.environ.get('EXPORT_CSV_PATH')

- @staticmethod
- def get_source_cohorts(export_data: dict, **kwargs) -> List[str]:
- source_cohorts_ids = [t.get("cohort_result_source")
- for t in export_data.get("export_tables", []) if t.get("cohort_result_source")]
- if len(set(source_cohorts_ids)) != 1:
- raise ValueError("All export tables must have the same source cohort")
- source_cohort_id = source_cohorts_ids[0]
- if not CohortResult.objects.filter(Q(pk=source_cohort_id) &
- Q(owner=kwargs.get("owner")))\
- .exists():
- raise ValueError(f"Missing cohort with id {source_cohort_id}")
- return source_cohorts_ids
-
def validate(self, export_data: dict, **kwargs) -> None:
if not export_data.get('nominative', False):
raise ValueError("Export must be in `nominative` mode")
- source_cohorts_ids = self.get_source_cohorts(export_data=export_data, owner=kwargs.get("owner"))
- kwargs["source_cohorts_ids"] = source_cohorts_ids
+ source_cohort_id = export_data.get("cohort_result_source")
+ if not CohortResult.objects.filter(Q(pk=source_cohort_id) &
+ Q(owner=kwargs.get("owner")))\
+ .exists():
+ raise ValueError(f"Missing cohort with ID {source_cohort_id}")
+ kwargs["source_cohorts_ids"] = [source_cohort_id]
super().validate(export_data=export_data, **kwargs)

def complete_data(self, export_data: dict, owner: User, **kwargs) -> None:
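After this change, the CSV validator reads a single `cohort_result_source` from the export payload instead of collecting one per export table. A self-contained sketch of the new check, with the Django ORM lookup replaced by a hypothetical in-memory set (the set name and sample IDs are stand-ins, not project code):

```python
# Stand-in for CohortResult rows owned by a user: (cohort_id, owner) pairs.
OWNED_COHORTS = {("cohort-42", "jdoe")}

def validate_csv_export(export_data: dict, owner: str) -> list[str]:
    # CSV exports must be nominative.
    if not export_data.get("nominative", False):
        raise ValueError("Export must be in `nominative` mode")
    source_cohort_id = export_data.get("cohort_result_source")
    # The real code runs CohortResult.objects.filter(pk=..., owner=...).exists()
    if (source_cohort_id, owner) not in OWNED_COHORTS:
        raise ValueError(f"Missing cohort with ID {source_cohort_id}")
    return [source_cohort_id]

print(validate_csv_export({"nominative": True,
                           "cohort_result_source": "cohort-42"}, "jdoe"))
```

The returned single-element list is what gets stashed in `kwargs["source_cohorts_ids"]`, preserving the interface the rest of the pipeline expects.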
26 changes: 26 additions & 0 deletions exporters/hive_exporter.py
@@ -1,8 +1,11 @@
import logging
import os
+ from typing import List

from requests import RequestException

+ from admin_cohort.types import JobStatus
+ from cohort.models import CohortResult
from exports.models import Export
from exporters.base_exporter import BaseExporter
from exporters.enums import ExportTypes
@@ -20,11 +23,34 @@ def __init__(self):
self.user = os.environ.get('HIVE_EXPORTER_USER')

def validate(self, export_data: dict, **kwargs) -> None:
+ self.validate_tables_data(tables_data=export_data.get("export_tables", []))
kwargs["source_cohorts_ids"] = [t.get("cohort_result_source")
for t in export_data.get("export_tables", [])
if t.get("cohort_result_source")]
super().validate(export_data=export_data, **kwargs)

+ def validate_tables_data(self, tables_data: List[dict]) -> bool:
+ required_table = self.export_api.required_table
+ base_cohort_provided = False
+ required_table_provided = False
+ for td in tables_data:
+ source_cohort_id = td.get('cohort_result_source')
+
+ if td.get("table_name", "") == required_table:
+ required_table_provided = True
+ if not source_cohort_id:
+ raise ValueError(f"The `{required_table}` table can not be exported without a source cohort")
+
+ if source_cohort_id:
+ if CohortResult.objects.filter(pk=source_cohort_id, request_job_status=JobStatus.finished).exists():
+ base_cohort_provided = True
+ else:
+ raise ValueError(f"Cohort `{source_cohort_id}` not found or did not finish successfully")
+
+ if not required_table_provided and not base_cohort_provided:
+ raise ValueError(f"`{required_table}` table was not specified; must then provide source cohort for all tables")
+ return True

def handle_export(self, export: Export, **kwargs) -> None:
self.confirm_export_received(export=export)
self.prepare_db(export)
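The relocated `validate_tables_data` accepts a payload when either the required table comes with its own source cohort, or some other table supplies a finished source cohort; note the match is now on `table_name` equality rather than membership in a `table_ids` list. A self-contained sketch of the same logic, with the ORM lookup mocked as a set of finished cohort IDs (`FINISHED_COHORTS` and the `"person"` required table are assumed stand-ins):

```python
# Stand-in for CohortResult rows whose request_job_status is `finished`.
FINISHED_COHORTS = {"c1", "c2"}
REQUIRED_TABLE = "person"  # assumption; the real value comes from export_api

def validate_tables_data(tables_data: list[dict]) -> bool:
    base_cohort_provided = False
    required_table_provided = False
    for td in tables_data:
        source_cohort_id = td.get("cohort_result_source")
        if td.get("table_name", "") == REQUIRED_TABLE:
            required_table_provided = True
            # The required table must carry its own source cohort.
            if not source_cohort_id:
                raise ValueError(f"The `{REQUIRED_TABLE}` table can not be "
                                 f"exported without a source cohort")
        if source_cohort_id:
            # The real code checks CohortResult.objects.filter(...).exists()
            if source_cohort_id in FINISHED_COHORTS:
                base_cohort_provided = True
            else:
                raise ValueError(f"Cohort `{source_cohort_id}` not found or "
                                 f"did not finish successfully")
    if not required_table_provided and not base_cohort_provided:
        raise ValueError(f"`{REQUIRED_TABLE}` table was not specified; must "
                         f"then provide source cohort for all tables")
    return True

print(validate_tables_data([{"table_name": "person",
                             "cohort_result_source": "c1"}]))  # True
```

For example, `[{"table_name": "person"}]` now fails (required table without a cohort), while a payload with no `person` table passes as long as every table references a finished cohort.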
Empty file removed: exporters/migrations/__init__.py
49 changes: 4 additions & 45 deletions exporters/tests/test_base_exporter.py
@@ -9,8 +9,6 @@ class TestBaseExporter(ExportersTestBase):

def setUp(self) -> None:
super().setUp()
- self.person_table_name = "person"
- self.cohorts = [self.cohort, self.cohort2]
self.api_conf = {
'API_URL': 'https://exports-api.fr/api',
'CSV_EXPORT_ENDPOINT': '/csv',
@@ -27,53 +25,14 @@ def setUp(self) -> None:
mock_exports_config.EXPORT_API_CONF = self.api_conf
self.exporter = BaseExporter()

- def test_validate_tables_data_all_tables_have_source_cohort(self):
- # all tables have a linked source cohort
- tables_data = [{"table_ids": [self.person_table_name], "cohort_result_source": self.cohorts[0].uuid},
- {"table_ids": ["other_table_01"], "cohort_result_source": self.cohorts[1].uuid}]
- check = self.exporter.validate_tables_data(tables_data=tables_data)
- self.assertTrue(check)
-
- def test_validate_tables_data_only_person_table_has_source_cohort(self):
- # only `person` table has a linked source cohort, the other tables don't
- tables_data = [{"table_ids": ["table_01"]},
- {"table_ids": [self.person_table_name], "cohort_result_source": self.cohorts[0].uuid},
- {"table_ids": ["table_02"]}]
- check = self.exporter.validate_tables_data(tables_data=tables_data)
- self.assertTrue(check)
-
- def test_validate_tables_data_one_table_with_source_cohort(self):
- # tables data is valid if the source cohort is provided within the table data
- tables_data = [{"table_ids": ["table_01"], "cohort_result_source": self.cohorts[0].uuid}]
- check = self.exporter.validate_tables_data(tables_data=tables_data)
- self.assertTrue(check)
-
- def test_validate_tables_data_missing_source_cohort_for_person_table(self):
- # tables tada is not valid if the `person` table dict is in the list but missing the source cohort
- tables_data = [{"table_ids": [self.person_table_name]},
- {"table_ids": ["table_01"], "cohort_result_source": self.cohorts[0].uuid}]
- with self.assertRaises(ValueError):
- self.exporter.validate_tables_data(tables_data=tables_data)
-
- def test_validate_tables_data_with_only_person_table_without_source_cohort(self):
- # tables data is not valid if the `person` table has no source cohort
- tables_data = [{"table_ids": [self.person_table_name]}]
- with self.assertRaises(ValueError):
- self.exporter.validate_tables_data(tables_data=tables_data)
-
- def test_validate_tables_data_all_tables_without_source_cohort_nor_person_table(self):
- # tables data is not valid if the `person` table has no source cohort
- tables_data = [{"table_ids": ["table_01"]},
- {"table_ids": ["table_02"]}]
- with self.assertRaises(ValueError):
- self.exporter.validate_tables_data(tables_data=tables_data)
-
def test_complete_export_data(self):
export_data = dict(output_format=ExportTypes.HIVE.value,
datalab=self.datalab.pk,
nominative=True,
- motivation='motivation\nover\nmulti lines',
- tables=[{"omop_table_name": "table1"}]
+ motivation='motivation\nover\nmultiple\nlines',
+ export_tables=[{"table_name": "table1",
+ "cohort_result_source": self.cohort.uuid
+ }]
)
self.exporter.complete_data(export_data=export_data, owner=self.csv_exporter_user)
self.assertIn("owner", export_data)