Optimised for multiple workers and auto archiving.
Based on creativecoder/piwik-heroku and «Setting up Piwik on Heroku» by Joshua Estes.
We check in the whole Matomo code base since the Composer package is broken, at least for 3.8.1 to 3.9.1. If you want to use this setup and install new plugins, you should fork it.
Prerequisite: Heroku CLI
heroku apps:create my-matomo --region eu
heroku buildpacks:add --index 1 https://github.com/danstiner/heroku-buildpack-geoip-geolite2
heroku buildpacks:add --index 2 heroku/php
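As configured above, the GeoIP buildpack runs before heroku/php; you can double-check the order with:
heroku buildpacks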
- Add a MYSQL_URL, e.g. AWS RDS
- Add a REDIS_URL, e.g. heroku addons:create heroku-redis:premium-1
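If your MySQL instance lives outside of Heroku (e.g. AWS RDS), set the URL yourself; host, credentials and database name below are placeholders:
heroku config:set MYSQL_URL=mysql://user:password@your-db-host:3306/matomo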
heroku config:set SALT=XXXXXXX TRUSTED_HOST=my-matomo.herokuapp.com MAXMIND_LICENSE_KEY=XXXXXXX
git push heroku
You'll need to obtain a free MaxMind key for GeoIP.
generate.config.ini.php is always run before the app starts on Heroku, ensuring that environment changes are always reflected in the generated config.
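The Procfile wires this up roughly like the following sketch (whether heroku-php-nginx or heroku-php-apache2 is used, and the exact flags and paths, may differ in this repo):
web: php ./generate.config.ini.php && vendor/bin/heroku-php-nginx -F fpm_custom.conf matomo/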
Prerequisite: PHP, phpredis, MySQL 5.7
brew install php mysql@5.7
pecl install redis
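A local MySQL and Redis server also need to be running, and the database referenced below has to exist; something along these lines should do, assuming Homebrew services and default credentials:
brew install redis
brew services start mysql@5.7
brew services start redis
mysql -uroot -e "CREATE DATABASE IF NOT EXISTS piwik"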
Run generate.config.ini.php with inline envs:
REDIS_URL=redis://127.0.0.1:6379 \
MYSQL_URL=mysql://root:@localhost:3306/piwik \
TRUSTED_HOST=localhost:8000 \
SALT=XXXXXXX \
php ./generate.config.ini.php
php -S 0.0.0.0:8000 -t matomo/
«Process during tracking request» should be disabled in System -> General Settings (QueuedTracking section), and a scheduler or dedicated worker should be used to process the queue instead (a worker sketch follows the scheduler command below).
This is because Heroku generally enforces a 30s timeout for requests, and by default the php-fpm script execution time is capped at 30s as well, so processing will be aborted if it exceeds 30s.
Use the «Heroku Scheduler» addon and set up a job to run the following command every 10 minutes with a performance-l dyno:
php ./generate.config.ini.php && php -d memory_limit=14G ./matomo/console queuedtracking:process
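Instead of the scheduler, a dedicated worker dyno would also work; a rough Procfile sketch (the loop and the sleep interval are assumptions, since queuedtracking:process exits once it has processed the queued requests):
worker: php ./generate.config.ini.php && while true; do php ./matomo/console queuedtracking:process; sleep 60; done
To keep an eye on the queue: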
heroku run "php ./generate.config.ini.php && ./matomo/console queuedtracking:monitor"
One could also raise the php-fpm execution limit in fpm_custom.conf:
request_slowlog_timeout = 25s
request_terminate_timeout = 5m
However, this will still produce warnings in the logs, high response times, and possibly timeouts in the Heroku metrics.
# invalidate all reports via Settings -> System -> Invalidate reports
# run detached to avoid timeout
heroku run:detached --size=performance-l "php ./generate.config.ini.php && php -d memory_limit=14G ./matomo/console core:archive --force-all-websites --php-cli-options=\"-d memory_limit=14G\" --concurrent-requests-per-website=8"
heroku ps # get run number, e.g. 1
# follow logs
heroku logs --dyno run.1 -t
# stop if needed
heroku ps:stop run.1
See Matomo docs for more options.
Use the «Heroku Scheduler» addon and set up a job to run the following command every hour with a performance-l dyno:
php ./generate.config.ini.php && php -d memory_limit=14G ./matomo/console core:archive --force-periods="day,week" --force-date-last-n=2 --php-cli-options="-d memory_limit=14G"
And the following command every night, e.g. at 00:30 UTC, with a performance-l dyno:
php ./generate.config.ini.php && php -d memory_limit=14G ./matomo/console core:archive --php-cli-options="-d memory_limit=14G"
Run it locally and install via the interface. Afterwards, manually sync the newly added or removed plugins to Plugins[] and PluginsInstalled[] in generate.config.ini.php (see the sketch below). Commit the file system changes and deploy.
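The entries in question use the standard Matomo config format; roughly like this, with CustomDimensions as an example plugin (the exact way generate.config.ini.php emits them may differ):
Plugins[] = "CustomDimensions"
PluginsInstalled[] = "CustomDimensions"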
This setup is configured to use the GeoIp2 plugin included in the core Matomo package. The GeoLite databases are downloaded on every deploy using danstiner/heroku-buildpack-geoip-geolite2.
You can turn this geolocation method on in Settings > System > Geolocation.
Download the zip from matomo.org and extract it over the existing files:
wget https://builds.matomo.org/matomo.zip
unzip -o matomo.zip
rm matomo.zip "How to install Matomo.html"
For plugins you can merge with e.g. ditto:
ditto ~/Downloads/CustomDimensions ~/Code/matomo/matomo/plugins/CustomDimensions
(We usually prefer to update the plugins through the Matomo interface running locally, then commit/push/deploy those changes to Heroku.)
Run it locally after the update and do a system check:
php -S 0.0.0.0:8000 -t matomo/
open "http://localhost:8000/index.php?module=Installation&action=systemCheckPage"
Do not forget to check for plugins to update!
The system check will often report files that can be removed after an update. PHP config issues and warnings about archiving completion can be ignored locally.
Watch out: big migrations may take longer in production than locally.
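Commit the updated files before deploying (the message is just an example):
git add -A
git commit -m "Update Matomo core and plugins"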
Before deploying you can put the admin interface into maintenance mode by setting MAINTENANCE_MODE=1:
# faster env switching
heroku features:disable preboot
heroku config:set MAINTENANCE_MODE=1
git push production
Once the code is deployed, disable tracking by setting the env below. Alternatively, if you're using the queuedtracking:process scheduler job, you can keep tracking and just temporarily remove the process job. You might also want to remove the archive jobs.
heroku config:set DISABLE_TRACKING=1
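Presumably generate.config.ini.php maps these envs onto the standard Matomo settings, roughly like this (the exact mapping lives in that script):
[General]
maintenance_mode = 1

[Tracker]
record_statistics = 0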
And then start the migration with a detached one-off dyno:
# run detached to avoid timeout
heroku run:detached --size=performance-l "php ./generate.config.ini.php && php -d memory_limit=14G /app/matomo/console core:update --yes"
heroku ps # get run number, e.g. 1
# follow logs
heroku logs --dyno run.1 -t
# stop if needed
heroku ps:stop run.1
After the migration, remove the MAINTENANCE_MODE and DISABLE_TRACKING envs. For minor migrations you may be able to skip MAINTENANCE_MODE and DISABLE_TRACKING entirely.
heroku config:unset MAINTENANCE_MODE DISABLE_TRACKING
# re-enable for seamless deploys with no or only small migrations
heroku features:enable preboot
If you removed any scheduler jobs: make sure to add them again.
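Scheduler jobs can't be restored from the CLI; you can jump straight to the dashboard with:
heroku addons:open scheduler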