MOLISA-Data-Warehouse-Integration

Extract data from multiple databases across the Labor, Invalids and Social Affairs sectors, convert it into an appropriate structure and format, and load it into a shared data warehouse and data marts. Staff of state agencies can then easily retrieve and analyze data from the compiled warehouse.

Integration Strategy

[Image: overall integration architecture]

Description: this data warehouse was designed following the Inmon approach, which integrates all data into a single enterprise warehouse and then derives several data marts, each associated with a sector of the government system.

  • Data Source: multiple databases from different systems in the governmental sector
  • Medallion Architecture: refining data across bronze -> silver -> gold layers, with the goal of improving the structure and quality of data for better insights and analysis (a PySpark sketch of this flow follows this list)
  • Staging Area: ensuring independence between the source databases and the data warehouse while performing transformations and aggregations
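To make the medallion flow concrete, here is a minimal PySpark sketch of promoting records from bronze to silver to gold. The paths, column names, and cleaning rules are illustrative assumptions, not the project's actual schema.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("molisa-medallion").getOrCreate()

# Bronze: raw extract landed as-is from a source database
# (path and schema below are hypothetical placeholders)
bronze = spark.read.parquet("/staging/bronze/residents")

# Silver: cleaned and conformed records
silver = (
    bronze
    .dropDuplicates(["resident_id"])
    .filter(F.col("resident_id").isNotNull())
    .withColumn("province", F.trim(F.col("province")))
)
silver.write.mode("overwrite").parquet("/staging/silver/residents")

# Gold: aggregated, analysis-ready data for the warehouse and data marts
gold = silver.groupBy("province").agg(F.count("*").alias("resident_count"))
gold.write.mode("overwrite").parquet("/staging/gold/resident_counts")

Writing each layer back to the staging area keeps the transformations independent of the source databases, as the Staging Area point above describes.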

Data Pipeline Automation

Every step in this project is designed as part of an automatable data pipeline: raw data is loaded from the sources, passed through the medallion procedure to ensure the quality of information, and finally loaded into the warehouse and data marts.

  • Scheduler: leveraging Apache Airflow to automate the end-to-end integration process (a minimal DAG sketch follows this list)
  • Transformation: using the Apache Spark engine, via the PySpark package in Python, to process and aggregate information
  • Environment: the process is deployed on Docker containers, including the database server and Airflow
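A minimal Airflow DAG in this spirit is sketched below. The DAG id, schedule, and task callables are placeholders standing in for the repository's real PySpark jobs.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables standing in for the actual PySpark jobs
def load_bronze():
    print("extract raw data from source databases into the bronze layer")

def refine_silver():
    print("clean and conform bronze data into the silver layer")

def build_gold():
    print("aggregate silver data and load the warehouse and data marts")

with DAG(
    dag_id="molisa_dwh_integration",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    bronze = PythonOperator(task_id="load_bronze", python_callable=load_bronze)
    silver = PythonOperator(task_id="refine_silver", python_callable=refine_silver)
    gold = PythonOperator(task_id="build_gold", python_callable=build_gold)

    # Medallion ordering: bronze must finish before silver, silver before gold
    bronze >> silver >> gold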

Docker Setup

Dockerfile for Airflow and Spark

FROM apache/airflow:2.9.1-python3.11

USER root

# Install OpenJDK 17 (required by the Spark engine)
RUN apt-get update && \
    apt-get install -y openjdk-17-jdk ant && \
    apt-get clean

# Set JAVA_HOME for Spark (ENV persists across layers; a RUN export would not)
ENV JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64/

USER airflow

# Copy DAG definitions and Python requirements into the image
COPY ./airflow/dags /opt/airflow/dags
COPY requirements.txt .

# Install Python dependencies (including the PySpark package), then drop the copied file
RUN pip install --no-cache-dir -r requirements.txt && rm requirements.txt
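After building the image, a quick way to confirm that the installed JDK and PySpark line up is to start a local SparkSession inside the container. This smoke-test snippet is an assumption for illustration, not part of the repository.

from pyspark.sql import SparkSession

# Runs inside the Airflow container; PySpark locates the JDK via JAVA_HOME
spark = (
    SparkSession.builder
    .master("local[*]")  # local mode in-container; a cluster URL could be used instead
    .appName("molisa-smoke-test")
    .getOrCreate()
)
print(spark.version)
spark.stop()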

DAGs for data warehouse integration

[Image: Airflow DAG graph for the data warehouse integration]

DAGs for Resident data mart integration

[Image: Airflow DAG graph for the Resident data mart integration]

DAGs for Time and Location integration

[Image: Airflow DAG graph for the Time and Location integration]
