User Interface for respondents to access ONS Survey Data Collection questionnaires and services
To install all dependencies and download the templates, run:
make install
To build the docker image and run the tests:
make build
Once that's done, you can run it in your terminal with:
make run
Alternatively, to run it in PyCharm, use the specified run template and it should work as expected.
The site uses PyBabel for translations.
Text to translate is marked up in .html templates and .py files with the gettext mechanism. A messages.pot file is then built via PyBabel, which collates all the text to translate. Based on the content of the .pot file, translation files (messages.po) are created for all specified language codes. Finally, the .po files are compiled into .mo files, which are then used on the page.
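As a rough illustration of that markup, here is a minimal sketch of a Python helper, assuming the app exposes gettext via Flask-Babel (the module, function name and string below are invented for this example; in .html templates the equivalent markup, assuming Jinja, is typically {{ _("Start survey") }}):

# Illustrative only: a hypothetical helper, not a file from this repository.
# pybabel extract collects string literals wrapped in the configured gettext
# keywords (typically _() / gettext()) and writes them to messages.pot.
from flask_babel import gettext as _  # assumption: the app uses Flask-Babel


def start_button_label() -> str:
    # After extraction, "Start survey" appears as a msgid in
    # rh_ui/translations/messages.pot, ready for translation in each
    # language's messages.po file.
    return _("Start survey")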
- Extract text for translation
To build/re-build the translation messages.pot file, use:
pipenv run pybabel extract -F babel.cfg -o rh_ui/translations/messages.pot .
- Create a new language file
Caution
Follow this step only when you want to add a translation for a new language. Running it for an existing language code will overwrite the existing translations.
To create a new language messages file, run the following, changing the two-character language code at the end to the required language code. Only generate an individual language file once.
pipenv run pybabel init -i rh_ui/translations/messages.pot -d rh_ui/translations -l cy
- Update translation files
Once created, you can update the existing language messages.po files to include changes in the messages.pot by running the following. This will update ALL language files.
pipenv run pybabel update -i rh_ui/translations/messages.pot -d rh_ui/translations
- Compile the translations
Note
Double check your translation files (messages.po) for the #, fuzzy comment. This indicates that the translation isn't an exact match and won't be rendered on the page. Double check the translations and remove any incorrectly matched lines, then remove the fuzzy comment.
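As an illustration, a fuzzy entry in a messages.po file looks something like the following (the source reference, msgid and msgstr are made up for this example):

#: rh_ui/templates/start.html:12
#, fuzzy
msgid "Start survey"
msgstr "Dechrau'r arolwg"

Once you are happy the translation is correct, delete the #, fuzzy line so the entry is compiled and rendered.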
To compile updates to the messages.po files into messages.mo files (the files actually used by the site), use:
pipenv run pybabel compile -d rh_ui/translations
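How the compiled catalogues are then picked up depends on the application setup; as a minimal sketch, assuming a Flask app wired up with Flask-Babel (the locale-selection rule below is invented for illustration and is not taken from this repository):

# A minimal sketch, assuming Flask + Flask-Babel; the real app setup may differ.
from flask import Flask, request
from flask_babel import Babel

app = Flask(__name__)
app.config["BABEL_DEFAULT_LOCALE"] = "en"
# Directory holding the compiled <lang>/LC_MESSAGES/messages.mo catalogues,
# relative to the application package.
app.config["BABEL_TRANSLATION_DIRECTORIES"] = "translations"


def select_locale():
    # Hypothetical rule: serve Welsh for URLs under /cy/, otherwise English.
    return "cy" if request.path.startswith("/cy/") else "en"


babel = Babel(app, locale_selector=select_locale)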
We have a suite of Venom tests for testing the OWASP header recommendations. These tests are copied from the OWASP Secure Headers Project.
These tests are designed to be run against a public HTTPS endpoint in a prod-like environment, since some of the headers they check must not be present on HTTP responses, e.g. the HSTS header. Hence, these tests cannot all pass when run against a local HTTP-only instance. However, it may still be useful during development to run them locally.
- First, use our docker dev to run our local development environment to test against.
- Now create a local copy of the Venom test suite with the correct URL substituted in:
sed -e "s|<VENOM_TARGET_URL>|http://host.docker.internal:9092/en/start|" ./tests/venom_tests.yml > tmp_venom_local.yml
It may be helpful to now open the created tmp_venom_local.yml file in an editor and comment out the Strict-Transport-Security test case, which is bound to fail locally.
- Now run the tests against the docker dev RH UI using the official OVH Venom docker image (see https://github.com/ovh/venom?tab=readme-ov-file#docker-image for more details on running Venom and its configuration).
mkdir -p tmp_venom_results
docker run --network="ssdcrmdockerdev_default" \
  --mount type=bind,source=$(pwd)/tmp_venom_local.yml,target=/workdir/tests/tests.yml \
  --mount type=bind,source=$(pwd)/tmp_venom_results,target=/workdir/results \
  ovhcom/venom:latest
- The test results should be shown in your terminal, and detailed logs written to the tmp_venom_results folder.