Have you ever had to explain a suspiciously large 'business lunch', wonder where all your money went, or chase down friends for their share of last night's dinner? With ExpenseFlow, budget leaks become a thing of the past and you'll never have to play detective with your finances again.
ExpenseFlow is a comprehensive expense management tool aimed at both individuals and businesses. It simplifies financial tracking and reporting through an intuitive interface, and with automated document scanning and auto-generated reports it gives users financial clarity and control. For individuals, it offers easy budget management and spending analysis, making expense tracking effortless and transparent.
This project has been completed solely by members of Team ExpenseFlow.
Below is the project structure for ExpenseFlow:
```
./
├── api/     # Backend, built in Python using FastAPI
├── assets/  # Various project-related assets
├── infra/   # Infrastructure as Code (Terraform)
├── model/   # Model artefacts (e.g. Structurizr DSL or PUML files)
├── report/  # Project report
├── ui/      # Frontend, built with Flutter
├── ...      # Other top-level files
```
The quickest and easiest way to get the project running on your local machine is to use Docker Compose. This will let you build and run the backend, frontend and database containers locally.
> [!NOTE]
> The `local.sh` and `kill_local.sh` scripts both require Docker Compose to be installed on your machine.
Before starting the app with `./local.sh`, you must have Docker Compose installed and a `.env` file in the project root directory with the following values:

```
AUTH0_DOMAIN=...     # Domain name of your Auth0 tenant (Auth0 is used for authentication in the project)
AUTH0_CLIENT_ID=...  # Client ID of your Auth0 client application
JWT_AUDIENCE=...     # Identifier of your Auth0 API application (resource server)
```
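As a rough illustration of how a backend might consume these values, here is a minimal stdlib-only sketch. This is a hypothetical helper, not ExpenseFlow code; a real service would more likely use a library such as python-dotenv.

```python
# Hypothetical helper, not ExpenseFlow code: shows how the .env values above
# could be read with the standard library alone (a real backend would more
# likely use a library such as python-dotenv).
REQUIRED_KEYS = {"AUTH0_DOMAIN", "AUTH0_CLIENT_ID", "JWT_AUDIENCE"}

def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines, ignoring blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if "=" in line:
            key, value = line.split("=", 1)
            env[key.strip()] = value.strip()
    return env

def missing_keys(env: dict) -> set:
    """Return the required Auth0 keys absent from a parsed .env."""
    return REQUIRED_KEYS - env.keys()
```

Checking for missing keys up front gives a clearer error than a failed Auth0 call later.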
> [!IMPORTANT]
> You must also have an Auth0 tenant set up to run this project. It can be deployed using Terraform, or, if you have access to the repository settings, you can find our Auth0 secrets under Settings > Secrets and Variables > Actions.
Then run `./local.sh`. This will make the API and UI available at the following ports:

- Web UI -> `localhost:3000`
- API -> `localhost:8080` (the base path `/` will redirect you to the docs at `/docs`)
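The base-path redirect above can be sketched with the standard library alone. This is a hypothetical stand-in, not the project's code; the real API is built with FastAPI, where `fastapi.responses.RedirectResponse` achieves the same thing.

```python
# Hypothetical stand-in using only the standard library; the real backend is
# a FastAPI app, where fastapi.responses.RedirectResponse does the same job.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            # The base path redirects callers to the interactive docs
            self.send_response(307)
            self.send_header("Location", "/docs")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging to keep output clean
        pass

def redirect_target(port: int) -> str:
    """Return the Location header the server sends for GET /."""
    conn = http.client.HTTPConnection("localhost", port)
    conn.request("GET", "/")
    location = conn.getresponse().getheader("Location")
    conn.close()
    return location

server = HTTPServer(("localhost", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
target = redirect_target(server.server_address[1])
server.shutdown()
```

Redirecting `/` means anyone probing the bare API port lands on the Swagger docs instead of a 404.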
Once you are finished with the application, run `./kill_local.sh` to stop everything.
> [!NOTE]
> Our deployment is done using Terraform, so you will need it installed. See the official Terraform install guide.
> [!TIP]
> You can skip this process by running the Infrastructure Deployment workflow with your AWS credentials as input.
To deploy to AWS, put your AWS credentials in a file called `credentials` in the root directory (the file name must be exact).

> [!NOTE]
> Your AWS credentials must be in the following format:
>
> ```
> [profile_name]
> aws_access_key_id=...
> aws_secret_access_key=...
> aws_session_token=...
> ```
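This is the standard AWS shared-credentials INI layout, so it can be parsed with Python's built-in `configparser`. The sketch below is illustrative only; the deployment itself consumes the file via Terraform and AWS tooling, not this code, and the sample values are made up.

```python
# Illustrative only: the credentials file above is the standard AWS
# shared-credentials INI layout, so Python's configparser can read it.
# (The deployment itself consumes this file via Terraform/AWS tooling.)
import configparser

def read_aws_profile(text: str, profile: str) -> dict:
    """Return the key/value pairs of one [profile] section."""
    parser = configparser.ConfigParser()
    parser.read_string(text)
    return dict(parser[profile])

# Made-up sample values, matching the format shown above
sample = """\
[profile_name]
aws_access_key_id=AKIAEXAMPLE
aws_secret_access_key=examplesecret
aws_session_token=exampletoken
"""
creds = read_aws_profile(sample, "profile_name")
```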
You must also have a `terraform.tfvars` file in the `/infra` directory containing the following values:

```
db_password         = ...
auth0_domain        = ... # Same value as local deployment
auth0_client_id     = ... # Same value as local deployment
auth0_client_secret = ... # Same value as local deployment
sentry_dsn          = "https://f0e2babc247dfbc9bef0b233664acab0@o4509370795032576.ingest.us.sentry.io/4509370811219968"
```
Once complete, run `./infra/deploy-infra.sh [--auto]`, where the optional `--auto` flag answers the confirmation prompts automatically.

To remove the infrastructure, run `./infra/teardown.sh [--auto]`, again with `--auto` optionally answering the confirmation prompts automatically.
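A minimal sketch of how such a script might translate `--auto` into Terraform's `-auto-approve` flag. This is hypothetical; the real `deploy-infra.sh` and `teardown.sh` may work differently.

```shell
# Hypothetical sketch only; the real deploy-infra.sh may differ.
# Map the optional --auto flag onto Terraform's -auto-approve flag,
# which skips the interactive confirmation prompts.
tf_apply_cmd() {
  if [ "$1" = "--auto" ]; then
    echo "terraform -chdir=infra apply -auto-approve"
  else
    echo "terraform -chdir=infra apply"
  fi
}

tf_apply_cmd --auto
```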
To run the unit and integration tests on the backend, run:

```
./test.sh
```
> [!NOTE]
> This requires Docker Compose to be installed, and an internet connection to pull the Postgres image.

This script uses Docker Compose because a running Postgres instance is required for the Pytest tests. Once complete, the script will also output the code coverage details.
> [!TIP]
> You can skip this process by running the Backend Tests workflow, which runs the tests and a static type checker on the code.