Commit eeb3d22: Update docs and improve formatting
AndrewSazonov committed Feb 3, 2025 (1 parent eb6f02d)
Showing 23 changed files with 529 additions and 275 deletions.
32 changes: 16 additions & 16 deletions DEVELOPMENT.md
@@ -27,29 +27,30 @@ This is an example of a workflow that describes the development process.
```console
python -m pip install --upgrade pip
```
- Install easydiffraction from root with `dev` extras for development, `charts`
extras for Jupyter notebooks and `docs` extras for building documentation
```console
pip install '.[dev,charts,docs]'
```
- Make changes in the code
```console
...
```
- Check the validity of pyproject.toml
```console
validate-pyproject pyproject.toml
```
- Run Ruff - Python linter and code formatter (configuration is in
pyproject.toml)<br/> Linting (overwriting files)
```console
ruff check . --fix
```
Formatting (overwriting files)
```console
ruff format .
```
- Install and run Prettier - code formatter for Markdown, YAML, TOML, etc. files
(configuration in prettierrc.toml)<br/> Formatting (overwriting files)
```console
npm install prettier prettier-plugin-toml --save-dev --save-exact
npx prettier . --write --config=prettierrc.toml
@@ -58,9 +59,8 @@ This is an example of a workflow that describes the development process.
```console
pytest tests/ --color=yes -n auto
```
- Clear the output of all Jupyter notebooks (only those that were changed!).
  Replace `examples/*.ipynb` with the path to the notebook(s) you want to clear
```console
jupyter nbconvert --clear-output --inplace examples/*.ipynb
```
@@ -72,12 +72,12 @@ This is an example of a workflow that describes the development process.
```console
pytest --nbmake examples/ --ignore-glob='examples/*emcee*' --nbmake-timeout=300 --color=yes -n=auto
```
- Add extra files to build documentation (from `../assets-docs/` and
`../assets-branding/` directories)
```console
cp -R ../assets-docs/docs/assets/ docs/assets/
cp -R ../assets-docs/includes/ includes/
cp -R ../assets-docs/overrides/ overrides/
mkdir -p docs/assets/images/
cp ../assets-branding/EasyDiffraction/logos/edl-logo_dark.svg docs/assets/images/logo_dark.svg
cp ../assets-branding/EasyDiffraction/logos/edl-logo_light.svg docs/assets/images/logo_light.svg
@@ -89,14 +89,14 @@ This is an example of a workflow that describes the development process.
cp ../assets-docs/mkdocs.yml mkdocs.yml
echo "" >> mkdocs.yml
cat docs/mkdocs.yml >> mkdocs.yml
```
- Build documentation with MkDocs - static site generator
```console
export JUPYTER_PLATFORM_DIRS=1
mkdocs serve
```
- Test the documentation locally (built in the `site/` directory). E.g., on
macOS, open the site in the default browser via the terminal
```console
open http://127.0.0.1:8000
```
54 changes: 43 additions & 11 deletions docs/analysis.md
@@ -1,14 +1,26 @@
# Analysis

This section contains information about the analysis of diffraction data in
EasyDiffraction.

## Model-dependent analysis

There are two general approaches to the analysis of data: **model-dependent**
and **model-independent**. In the following examples, we are going to focus on
the former. However, the latter is worth briefly highlighting.

A model-independent approach to analysis is where no assumptions are made about
the system that is being studied and conclusions are drawn only from the data
that has been observed. However, in many applications, it is desirable to
include what we think we know about the system, and so model-dependent analysis
is used.

Model-dependent analysis involves the development of a mathematical model that
describes the model dataset that would be found for our system. This
mathematical model usually has parameters that are linked to the physics and
chemistry of our system. These parameters are varied to optimise the model,
using an optimisation algorithm, with respect to the experimental data, i.e., to
get the best agreement between the model data and the experimental data.

Below is a diagram illustrating this process:

@@ -26,20 +38,40 @@ flowchart LR
d-- Threshold<br/>reached -->e
```

Model-dependent analysis is popular in the analysis of neutron scattering data,
and we will use it in the following examples.
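The loop in the diagram above can be sketched in plain Python. This is a deliberately toy example (invented data, a single `slope` parameter, and a hand-rolled search instead of a real minimisation engine) meant only to make the iterate-compare-adjust cycle concrete:

```python
# Toy version of the model-dependent analysis loop (not EasyDiffraction API):
# vary one model parameter until the model agrees with the "experimental" data.

experimental = [(x, 2.0 * x) for x in range(10)]  # invented measurements, true slope = 2.0

def chi2(slope):
    # Agreement metric: sum of squared differences between model and experiment.
    return sum((slope * x - y) ** 2 for x, y in experimental)

slope, step = 0.5, 0.1
best = chi2(slope)
while step > 1e-6:  # stop when the search step falls below a threshold
    for trial in (slope + step, slope - step):
        if chi2(trial) < best:  # keep a parameter change that improves agreement
            slope, best = trial, chi2(trial)
            break
    else:
        step /= 2  # no improvement in either direction: refine the step

print(round(slope, 3))  # → 2.0
```

Real engines replace the crude step search with proper optimisation algorithms, but the structure (compute model data, compare with experiment, adjust parameters, repeat until a threshold is reached) is the same.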

## Calculation engines

EasyDiffraction is designed to be a flexible and extensible tool for calculating
diffraction patterns. It can use different calculation engines to perform the
calculations.

We currently rely on [CrysPy](https://www.cryspy.fr) as a calculation engine.
CrysPy is a Python library originally developed for analysing polarised neutron
diffraction data. It is now evolving into a more general-purpose library and
covers powders and single crystals, nuclear and (commensurate) magnetic
structures, and unpolarised neutron and X-ray diffraction.

Another calculation engine is
[CrysFML](https://code.ill.fr/scientific-software/CrysFML2008). This library is
a collection of Fortran modules for crystallographic computations. It is used in
the software package [FullProf](https://www.ill.eu/sites/fullprof/), and we are
currently working on its integration into EasyDiffraction.

## Minimisation engines

EasyDiffraction uses different third-party libraries to perform the
model-dependent analysis.

Most of the examples in this section will use the
[lmfit](https://lmfit.github.io/lmfit-py/) package, which provides a high-level
interface to non-linear optimisation and curve fitting problems for Python. It
is one of the tools that can be used to fit models to the experimental data.

Another package that can be used for the same purpose is
[bumps](https://bumps.readthedocs.io/en/latest/). In addition to traditional
optimisers, which search for the best minimum they can find in the search space,
bumps provides Bayesian uncertainty analysis, which explores all viable minima
and finds confidence intervals on the parameters based on uncertainty in the
measured values.
