hotfix #376

Open · wants to merge 29 commits into base: master
Changes from all commits (29 commits)
e0afcf1
hotfix
robertavram Nov 4, 2024
31b1c29
Update export.py
robertavram Nov 4, 2024
1a0bd47
Update export.py
robertavram Nov 4, 2024
a66f730
Update __init__.py
robertavram Nov 4, 2024
ce8e921
make groups filter conssistent
Nov 7, 2024
3b00a52
Update CHANGES
robertavram Nov 7, 2024
72a7855
Update __init__.py
robertavram Nov 7, 2024
835fe0a
Merge branch 'fix_user_extra_filter_by_role' of https://github.com/un…
robertavram Nov 7, 2024
b2a5afc
Merge pull request #378 from unicef/fix_user_extra_filter_by_role
robertavram Nov 7, 2024
ddf9abc
further improve celery task failure reporting
Nov 5, 2024
da67f7a
improve code to make it more pythonic
Nov 5, 2024
52e91d3
bump up version
Nov 7, 2024
678c81d
Merge pull request #377 from unicef/improve_celery_task_failire_report
hbarisik Nov 7, 2024
7994be6
update container tag logic
Nov 8, 2024
ef24205
update version capture logic
Nov 10, 2024
765b8d5
fix issues in commands
Nov 10, 2024
3d0fd50
add more image tags and pushes
Nov 11, 2024
368eac1
add version based push after BASE_TAGE
Nov 11, 2024
8c6c936
attept to discover version
Nov 11, 2024
b632fcf
attempt-2 to discover version
Nov 11, 2024
f44f44b
make candiate tag with version
Nov 11, 2024
dec886e
Merge pull request #379 from unicef/ci_image_build_and_tagging
robertavram Nov 12, 2024
f5f04bd
update the formulate for pending_unsupported_amount
Nov 14, 2024
0f56235
bump up version and update CHANGES
Nov 14, 2024
f24645b
Merge pull request #380 from unicef/fix_unsupported_amount_formula
hbarisik Nov 14, 2024
06fb7a9
extract latitide and longitude from point and geom fields if they hap…
Dec 18, 2024
82d1289
Merge pull request #384 from unicef/extract_latitude_longitude
hbarisik Dec 19, 2024
a37ae8a
fix logical error regarding longitude function
Dec 19, 2024
201c0a0
Merge pull request #385 from unicef/extract_latitude_longitude
hbarisik Dec 19, 2024
34 changes: 29 additions & 5 deletions .circleci/config.yml
@@ -21,6 +21,14 @@ jobs:
else
docker build -t unicef/datamart:$BASE_TAG -f Dockerfile-installed .
docker push unicef/datamart:$BASE_TAG
docker build --build-arg BASE_TAG=$BASE_TAG -t unicef/datamart:$TAG .
echo "Current directory: $(pwd)"
echo "Files in current directory: $(ls -l)"
VERSION=$(grep 'VERSION =' src/etools_datamart/__init__.py | awk '{print $5}')
VERSION=${VERSION//\"}
echo "VERSION value is $VERSION"
docker tag unicef/datamart:$TAG unicef/datamart:"cand-tag-$VERSION"
docker push unicef/datamart:"cand-tag-$VERSION"
if (echo "develop" | grep -q "$CIRCLE_BRANCH"); then
docker tag unicef/datamart:$BASE_TAG unicef/datamart:latest
docker push unicef/datamart:latest
@@ -107,18 +115,34 @@ jobs:
- run:
name: Pushing to Docker Hub
command: |
echo "Current directory: $(pwd)"
echo "Files in current directory: $(ls -l)"
VERSION=$(grep 'VERSION =' src/etools_datamart/__init__.py | awk '{print $5}')
VERSION=${VERSION//\"}
echo "VERSION value is $VERSION"
TAG=${CIRCLE_BRANCH}
BASE_TAG="$(md5sum Pipfile.lock | cut -c1-6)$(md5sum Dockerfile-installed | cut -c1-6)"
docker login -u $DOCKER_USER -p $DOCKER_PASS
echo "TAG value is $TAG"
docker push unicef/datamart:$TAG
docker tag unicef/datamart:$TAG unicef/datamart:"tag-$VERSION"
docker push unicef/datamart:"tag-$VERSION"
docker tag unicef/datamart:$BASE_TAG unicef/datamart:"base_tag-$VERSION"
docker push unicef/datamart:"base_tag-$VERSION"
if (echo "develop" | grep -q "$CIRCLE_BRANCH"); then
docker tag unicef/datamart:$BASE_TAG unicef/datamart:latest
docker push unicef/datamart:latest
docker tag unicef/datamart:$TAG unicef/datamart:"dev-$VERSION"
docker push unicef/datamart:"dev-$VERSION"
docker tag unicef/datamart:$TAG unicef/datamart:dev-latest
docker push unicef/datamart:dev-latest
docker tag unicef/datamart:$BASE_TAG unicef/datamart:base-dev-latest
docker push unicef/datamart:base-dev-latest
elif (echo "master" | grep -q "$CIRCLE_BRANCH"); then
docker tag unicef/datamart:$BASE_TAG unicef/datamart:latest_prod
docker tag unicef/datamart:$TAG unicef/datamart:"prod-$VERSION"
docker push unicef/datamart:"prod-$VERSION"
docker tag unicef/datamart:$TAG unicef/datamart:latest_prod
docker push unicef/datamart:latest_prod
docker tag unicef/datamart:$BASE_TAG unicef/datamart:4.7.6
docker push unicef/datamart:4.7.6
docker tag unicef/datamart:$BASE_TAG unicef/datamart:base-prod-latest
docker push unicef/datamart:base-prod-latest
else
echo "Not a followed branch not pushing latest"
fi
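
For reference, a rough Python equivalent of the version-capture step in the CI config above (illustrative only, not part of this PR). It assumes the VERSION line in src/etools_datamart/__init__.py keeps the format shown later in this diff:

# Illustrative sketch: reads the version the same way the grep/awk pipeline above does,
# assuming a line of the form: VERSION = __version__ = "4.7.14"
import re

def read_version(path="src/etools_datamart/__init__.py"):
    with open(path) as fh:
        for line in fh:
            match = re.match(r'VERSION\s*=\s*__version__\s*=\s*"([^"]+)"', line)
            if match:
                return match.group(1)  # quotes already stripped, e.g. "4.7.14"
    return None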
34 changes: 34 additions & 0 deletions CHANGES
@@ -1,3 +1,37 @@
4.7.14
----
* Fix logic error in the extract_longitude function

4.7.13
----
* Improved FMQuestion and FMOntrack loaders to extract latitude and longitude from
  Location.point (or Location.geom) when Location.latitude and Location.longitude are not set.

4.7.12
----
Formula for SpotCheckFindings.pending_unsupported_amount has been updated as below:
= audit_spotcheck.total_amount_of_ineligible_expenditure
  - audit_engagement.additional_supporting_documentation_provided
  - audit_engagement.justification_provided_and_accepted
  - audit_engagement.write_off_required
  - audit_engagement.amount_refunded

4.7.11
----
CI pipeline improvements for version tagging.

4.7.10
----
Further improve Celery task failure error reporting

4.7.9
----
Make groups parameter name consistent


----
Hotfix for export access log to handle non-standard date format

4.7.6
----
* Missing dependencies for FMQuestionLoader, FMOntrack and LocationsiteLoader have been fixed.
2 changes: 1 addition & 1 deletion src/etools_datamart/__init__.py
@@ -1,3 +1,3 @@
NAME = "etools-datamart"
VERSION = __version__ = "4.7.6"
VERSION = __version__ = "4.7.14"
__author__ = ""
2 changes: 1 addition & 1 deletion src/etools_datamart/api/filtering.py
@@ -198,7 +198,7 @@ def to_html(self, request, queryset, view):


class GroupNameFilter(BaseFilterBackend):
query_param = "group"
query_param = "groups"
template = "api/group_filter.html"

def get_query(self, request):
2 changes: 1 addition & 1 deletion src/etools_datamart/api/templates/api/group_filter.html
@@ -2,7 +2,7 @@
{#<h2>{% trans "Group" %}</h2>#}
<div class="list-group">
<label for="id_vendor_number" class="control-label ">Groups</label>
<select name="group_name" multiple="multiple" data-allowclear="true" class="select2" style="width: 100%">
<select name="groups" multiple="multiple" data-allowclear="true" class="select2" style="width: 100%">
{% for group in groups %}<option {% if group in selection %}selected="selected"{% endif %}>{{ group }}</option>
{% endfor %}
</select>
@@ -69,6 +69,7 @@ def get_pending_unsupported_amount(self, record, values, field_name):
- record.additional_supporting_documentation_provided
- record.justification_provided_and_accepted
- record.write_off_required
- record.amount_refunded
)
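
A worked example of the updated formula, using made-up figures (illustrative only; the variable names mirror the record attributes in the diff above):

total_ineligible = 1000            # total_amount_of_ineligible_expenditure
supporting_documentation = 200     # additional_supporting_documentation_provided
justification_accepted = 100       # justification_provided_and_accepted
write_off_required = 50            # write_off_required
amount_refunded = 150              # amount_refunded (newly subtracted in this PR)

pending_unsupported_amount = (
    total_ineligible
    - supporting_documentation
    - justification_accepted
    - write_off_required
    - amount_refunded
)
assert pending_unsupported_amount == 500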


30 changes: 26 additions & 4 deletions src/etools_datamart/apps/mart/data/models/fm_questions.py
@@ -25,6 +25,28 @@
logger = get_task_logger(__name__)


def extract_latitude(location):
if location.latitude is not None:
return location.latitude
elif location.point is not None:
return location.point.y
elif location.geom is not None:
return location.centroid.y
else:
return None


def extract_longitude(location):
if location.longitude is not None:
return location.longitude
elif location.point is not None:
return location.point.x
elif location.geom is not None:
return location.centroid.x
else:
return None


class FMQuestionLoader(EtoolsLoader):
"""Loader for FM Questions"""

@@ -244,8 +266,8 @@ def get_location(self, record: FieldMonitoringDataCollectionFinding, values: dic
"admin_level": instance.admin_level,
"source_id": instance.source_id,
"location_type": instance.admin_level_name,
"latitude": instance.latitude,
"longitude": instance.longitude,
"latitude": extract_latitude(instance),
"longitude": extract_longitude(instance),
}
except Location.DoesNotExist:
return {key: "N/A" for key in loc_fields}
Expand Down Expand Up @@ -486,8 +508,8 @@ def get_location(self, record: FieldMonitoringDataCollectionActivityoverallfindi
"admin_level": instance.admin_level,
"source_id": instance.source_id,
"location_type": instance.admin_level_name,
"latitude": instance.latitude,
"longitude": instance.longitude,
"latitude": extract_latitude(instance),
"longitude": extract_longitude(instance),
}
except Location.DoesNotExist:
return {key: "N/A" for key in loc_fields}
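
A minimal usage sketch of the new extract_latitude/extract_longitude helpers above (illustrative only; the Location stand-in below is hypothetical, not an eTools model instance):

# Assumes extract_latitude/extract_longitude from fm_questions.py are in scope.
from types import SimpleNamespace

point = SimpleNamespace(x=36.82, y=-1.29)  # GEOS convention: x = longitude, y = latitude
location = SimpleNamespace(latitude=None, longitude=None, point=point, geom=None)

# latitude/longitude are unset, so both helpers fall back to Location.point
assert extract_latitude(location) == point.y
assert extract_longitude(location) == point.x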
14 changes: 13 additions & 1 deletion src/etools_datamart/celery.py
@@ -18,8 +18,20 @@
def handle_task_failure(
sender=None, task_id=None, exception=None, args=None, kwargs=None, traceback=None, einfo=None, **kw
):
failure_details = f"#Task: {task_id} failed \n Exception:{exception} \n Error info:|{einfo} "
kw_str = ""
if kw:
kw_str = "|".join(f"{key}={value}" for key, value in kw.items())

failure_details = (
f"# Task: {task_id} with args: {args}\n"
f"# Exception: {exception}\n"
f"# Error info: |{einfo}\n"
f"# Traceback:\n{traceback}\n"
f"# kw: {kw_str}"
)

logger.error(f"Failure: {failure_details}")

with sentry_sdk.push_scope() as scope:
scope.set_extra("celery_task_failure_details", failure_details)
sentry_sdk.capture_exception(exception)
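
For context, a handler with this signature is normally registered on Celery's task_failure signal; a minimal sketch of that wiring follows (the actual registration in this repository is outside the diff hunk and may use the decorator form):

# Sketch only: connects the failure handler to Celery's task_failure signal.
from celery.signals import task_failure

task_failure.connect(handle_task_failure)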
18 changes: 12 additions & 6 deletions src/unicef_rest_framework/admin/export.py
@@ -1,4 +1,4 @@
from datetime import timedelta
from datetime import datetime, timedelta

from django.contrib import admin, messages
from django.contrib.admin import ModelAdmin, register
@@ -160,11 +160,17 @@ def display_access_history(self, obj):
user = entry.get("u")
timestamp_utc = entry.get("t")
if timestamp_utc:
timestamp_local = timezone.localtime(timezone.datetime.fromtimestamp(timestamp_utc))
formatted_timestamp = timestamp_local.strftime("%Y-%m-%d %H:%M:%S")
formatted_history.append(f"{user}: {formatted_timestamp}")

return ",".join(formatted_history)
try:
# Parse the ISO 8601 string to a datetime object
timestamp_utc = datetime.fromisoformat(timestamp_utc)
timestamp_local = timezone.localtime(timezone.make_aware(timestamp_utc))
formatted_timestamp = timestamp_local.strftime("%Y-%m-%d %H:%M:%S")
formatted_history.append(f"{user}: {formatted_timestamp}")
except ValueError:
# Handle cases where the timestamp format is incorrect
formatted_history.append(f"{user}: Invalid timestamp")

return ", ".join(formatted_history)

display_access_history.short_description = "Export Access History"
list_display = ("export", "display_access_history")
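
A standalone sketch of the timestamp handling added above (illustrative only; Django settings are configured inline here, which the project itself does not need, and the sample timestamp is hypothetical):

from datetime import datetime
from django.conf import settings

if not settings.configured:
    settings.configure(USE_TZ=True, TIME_ZONE="UTC")  # assumption for this sketch only

from django.utils import timezone

raw = "2024-11-04T09:15:30"                 # ISO 8601 string, as written by log_access
parsed = datetime.fromisoformat(raw)        # naive datetime
aware = timezone.make_aware(parsed)         # make_aware uses Django's current timezone
print(timezone.localtime(aware).strftime("%Y-%m-%d %H:%M:%S"))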
5 changes: 1 addition & 4 deletions src/unicef_rest_framework/models/export.py
@@ -142,17 +142,14 @@ class ExportAccessLog(models.Model):
@classmethod
def log_access(cls, export, username):
import datetime
import json

utc_now = datetime.datetime.utcnow()
timestamp = utc_now.isoformat()
log_entry = {"u": f"{username}", "t": f"{timestamp}"}

try:
access_log = cls.objects.get(export=export)
export_access_data = json.loads(access_log.access_history)
export_access_data.append(log_entry)
access_log.access_history = json.dumps(export_access_data)
access_log.access_history.append(log_entry)
access_log.save()
except cls.DoesNotExist:
cls.objects.create(export=export, access_history=[log_entry])
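
For reference, the entry shape that log_access now appends directly to access_history (dropping the json.loads/json.dumps round-trip suggests access_history behaves as a JSON list field); the user and timestamp below are hypothetical:

import datetime

username = "jdoe"                                   # hypothetical user
timestamp = datetime.datetime.utcnow().isoformat()  # same call log_access uses
log_entry = {"u": f"{username}", "t": f"{timestamp}"}
# access_history then looks like: [{"u": "jdoe", "t": "2024-11-04T09:15:30.123456"}, ...]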