
# Spark development

Documentation hub for Spark development.

**Note:** This document is intended for people interested in helping the Spark development effort. If you just want to use Spark or integrate it with your system, refer to the Spark usage guide.

## TL;DR / Quickstart

Optionally, if you are familiar with Docker / Docker Compose, check out Using Spark with Docker.

If you want to update the documentation, have a look here:

If you want to contribute code, you should review the following documents as well:

Additional documents:

## Localization

We use the i18next internationalization library for multi-language support. Translation strings are stored in the `/locales` folder; each file is named after its language code.
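For example, a hypothetical `/locales/en.json` might look like the following (`welcome_spark` appears elsewhere in this document; the `registered_*` keys are made up for illustration):

```json
{
  "welcome_spark": "Welcome to Spark",
  "registered_male": "He has registered",
  "registered_female": "She has registered"
}
```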

Tips:
  1. Try to make the Hebrew text gender neutral. If you must differentiate between genders, use the `_male` or `_female` suffixes.
  2. Use variables in the text where needed; don't concatenate strings.
  3. i18next supports many string operations, including formatting, singular/plural forms, and more. Take a look at the i18next documentation.
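The tips above can be sketched with a minimal, self-contained stand-in for an i18next-style `t()` function. This is not the real i18next API, and the key names and `_male`/`_female` resolution shown here are illustrative only:

```javascript
// Minimal stand-in for an i18next-style t() function, for illustration only.
// Real code should use the i18next library itself.
const strings = {
  greeting: 'Hello {{name}}!',        // interpolation instead of concatenation
  invited_male: 'He was invited',     // gendered variants via suffixes
  invited_female: 'She was invited',
};

function t(key, opts = {}) {
  // Resolve a gendered variant if a context is given (mirrors the suffix tip).
  const variant = opts.context ? `${key}_${opts.context}` : key;
  const template = strings[variant] ?? strings[key] ?? variant;
  // Replace {{var}} placeholders with values taken from opts.
  return template.replace(/\{\{(\w+)\}\}/g, (_, name) => opts[name] ?? '');
}

console.log(t('greeting', { name: 'Dana' }));      // Hello Dana!
console.log(t('invited', { context: 'female' }));  // She was invited
```

Passing values through an options object, rather than concatenating strings, keeps word order under the translator's control.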

## Templates

We use the Jade template engine (since renamed Pug), a language that compiles to HTML, to separate logic from markup. No more angle brackets!

Read more in the Jade syntax documentation.

A great Jade to HTML converter is available here.
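As a taste of the syntax, here is a hypothetical Jade snippet (the `title` variable and `.container` class are made up for illustration); indentation replaces closing tags:

```jade
//- `.container` compiles to <div class="container">…</div>,
//- and `=` outputs the value of an expression.
.container
  h1= title
  p Welcome to Spark
```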

## i18n & templates

You can use i18n in Jade templates. To set an HTML element's content to a translatable string, the general syntax is:

```jade
HTML_ELEMENT=t('KEY')
```

Example:

```jade
h1=t('welcome_spark')
```

Text injection inside attributes:

```jade
#{t('KEY')}
```

Example:

```jade
input( data-error="#{t('bad_email')}" )
```

## Date formatting in templates

You can use moment.js functions inside a Jade template like this:

```jade
p #{moment(created_at).format('DD/MM/YYYY, HH:mm')}
```

where `created_at` is a Jade variable containing a date string.

If you want to format a date held in an Angular variable, use the Angular `date` filter like this (note that in Angular's date format, `MM` is the month while `mm` is minutes):

```jade
td {{camp.updated_at | date: "dd/MM/yyyy, HH:mm" }}
```

## Email

By default, Spark does not send emails. If you want emails from Spark to go through, set the `enabled` property in your local config file:

```json
"mail": {
  "enabled": true
}
```