diff --git a/docs/source/api_ref.md b/docs/source/api_ref.md
index 6fb951f..c5c69e8 100644
--- a/docs/source/api_ref.md
+++ b/docs/source/api_ref.md
@@ -81,18 +81,7 @@ This document is for developers and/or advanced users of OSP-core; it contains a
## Utilities
```eval_rst
-.. autoclass:: osp.core.utils.cuds2dot.Cuds2dot
- :members:
-
-.. automodule:: osp.core.utils.general
- :members:
-
-.. automodule:: osp.core.utils.pretty_print
- :members:
-
-.. automodule:: osp.core.utils.simple_search
- :members:
-
-.. automodule:: osp.core.utils.wrapper_development
- :members:
+.. automodule:: osp.core.utils
+ :imported-members:
+ :members:
```
\ No newline at end of file
diff --git a/docs/source/fundamentals.md b/docs/source/fundamentals.md
index 743826e..6d8fb40 100644
--- a/docs/source/fundamentals.md
+++ b/docs/source/fundamentals.md
@@ -139,7 +139,7 @@ CUDS, or Common Universal Data Structure, is the ontology compliant data format
- **CUDS is an ontology individual**: each CUDS object is an instantiation of a class in the ontology.
If we assume a food ontology that describes classes like pizza or pasta, a CUDS object could represent one specific pizza or pasta dish, that exists in the real world.
Similar to ontology individuals, CUDS objects can be related with other individuals/CUDS by relations defined in the ontology. Like a _pizza_ that 'hasPart' _tomato sauce_
-- **CUDS is API**: To allow users to interact with the ontology individuals and their data, CUDS provide a CRUD API.
+- **CUDS is API**: To allow users to interact with the ontology individuals and their data, CUDS provides a CRUD API.
- **CUDS is a container**: Depending on the relationship connecting two CUDS objects, a certain instance can be seen as a container of other instances.
We call a relationship that express containment an 'active relationship'.
In the pizza example, 'hasPart' would be an 'active relationship'. If one would like to share the pizza CUDS object with others, one would like to share also the tomato sauce.
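The containment semantics described above can be sketched in a few lines of hypothetical Python (these are not OSP-core's real classes; `Cuds`, `add` and `contained` are illustrative stand-ins): objects linked through an active relationship such as 'hasPart' travel along when the parent is shared.

```python
# Hypothetical sketch of CUDS containment via "active relationships".
# None of these names are OSP-core API; they only illustrate the concept.
class Cuds:
    def __init__(self, name):
        self.name = name
        self.relations = {}  # relationship name -> list of related Cuds

    def add(self, other, rel="hasPart"):
        """Relate `other` to this object through relationship `rel`."""
        self.relations.setdefault(rel, []).append(other)
        return other

    def contained(self, active=("hasPart",)):
        """Objects reachable through active relationships, i.e. the ones
        that would travel along when this object is shared."""
        result = []
        for rel in active:
            for child in self.relations.get(rel, []):
                result.append(child)
                result.extend(child.contained(active))
        return result


pizza = Cuds("pizza")
sauce = pizza.add(Cuds("tomato sauce"))
print([c.name for c in pizza.contained()])  # ['tomato sauce']
```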
diff --git a/docs/source/index.md b/docs/source/index.md
index 68c2045..392fb97 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -81,6 +81,7 @@ SimPhoNy is an ontology-based [open-source](./license.md) Python framework that
jupyter/sessions_and_vars.ipynb
utils.md
jupyter/multiple_wrappers.ipynb
+ jupyter/import_export.ipynb
jupyter/simlammps.ipynb
jupyter/quantum_espresso.ipynb
diff --git a/docs/source/jupyter/import_export.ipynb b/docs/source/jupyter/import_export.ipynb
new file mode 100644
index 0000000..f9795e7
--- /dev/null
+++ b/docs/source/jupyter/import_export.ipynb
@@ -0,0 +1,319 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "id": "boxed-professional",
+ "metadata": {},
+ "source": [
+ "# Tutorial: Import and export"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "operational-honey",
+ "metadata": {},
+ "source": [
+ "\n",
+ "\n",
+ "In this tutorial we will be covering the import and export capabilities of OSP-core. The utility functions that provide these functionalities are `import_cuds` and `export_cuds`, respectively."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "qualified-works",
+ "metadata": {},
+ "source": [
+ "
\n",
+ "
Tip
\n",
+ " \n",
+ "The full API specifictions of the import and export functions can be found in the\n",
+ "[utilities API reference page](../api_ref.html#osp.core.utils.export_cuds).\n",
+ " \n",
+ "
\n",
+ " \n",
+ "1. The format is automatically inferred from the file extension. To specify it explicitly, you can add the `format` parameter, like so: `import_cuds('./data.ttl', format='turtle')`.\n",
+ "1. The `session` parameter is optional and inferred automatically from the context that created by the `with` statement (see the [tutorial on multiple wrappers](./multiple_wrappers.html) for more information). You can specify the session explicitly like so: `import_cuds('./data.ttl', session=session)`.\n",
+ " \n",
+ "
"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3.7.4 64-bit",
+ "language": "python",
+ "name": "python37464bit7e5bfc198a4544d1be12f13215aed90d"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.8.5"
+ },
+ "metadata": {
+ "interpreter": {
+ "hash": "301cd6007de04cbbf15bca26f0bc1cb48004d089278091d760363de622bdd0c8"
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
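The notebook above notes that `import_cuds` infers the serialization format from the file extension when `format` is not given. As a rough, self-contained sketch of that idea (this is not OSP-core's actual implementation; the mapping and function name are hypothetical), an extension-to-format lookup might look like this:

```python
import os

# Hypothetical mapping from file extensions to RDF serialization format
# names, sketching how a function like import_cuds might infer the
# `format` parameter when it is not passed explicitly.
EXTENSION_TO_FORMAT = {
    ".ttl": "turtle",
    ".nt": "ntriples",
    ".rdf": "xml",
    ".owl": "xml",
    ".jsonld": "json-ld",
}


def guess_format(path):
    """Return the serialization format inferred from the file extension.

    Raises ValueError for unknown extensions, mirroring the idea that an
    explicit `format` argument is required in that case.
    """
    _, ext = os.path.splitext(path)
    try:
        return EXTENSION_TO_FORMAT[ext.lower()]
    except KeyError:
        raise ValueError(
            f"Could not infer format from extension {ext!r}; "
            "pass format= explicitly."
        )


print(guess_format("./data.ttl"))  # turtle
```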
diff --git a/docs/source/jupyter/ontology_interface.ipynb b/docs/source/jupyter/ontology_interface.ipynb
index 9529d98..8344f62 100644
--- a/docs/source/jupyter/ontology_interface.ipynb
+++ b/docs/source/jupyter/ontology_interface.ipynb
@@ -201,6 +201,28 @@
"- When the keyword `reference_by_label` is set to `False` (disabled) or not set, the dot notation is a shorthand for fetching by suffix instead. This keyword is **disabled** in the `city` namespace."
]
},
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "To get a list of all the entities available within a namespace, run `list(namespace)`."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "
\n",
+ "
\n",
+ "
Tip
\n",
+ " \n",
+ "The dot notation supports IPython autocompletion. For example, when working on a Jupyter notebook, once the namespace has been imported, it is possible to get suggestions for the entity names by writing `namespace.` and pressing TAB.\n",
+ "\n",
+ "
\n",
+ "
\n",
+ ""
+ ]
+ },
{
"cell_type": "markdown",
"metadata": {},
diff --git a/docs/source/links.md b/docs/source/links.md
index 76ac9b7..2f76edc 100644
--- a/docs/source/links.md
+++ b/docs/source/links.md
@@ -48,7 +48,8 @@ The following table describes the compatibilities between SimPhoNy docs and O
============= ==========
SimPhoNy docs OSP-core
============= ==========
- 2.4.x 3.5.2-beta
+ 2.4.1 3.5.3.1-beta
+ 2.4.0 3.5.2-beta
2.3.x 3.4.0-beta
2.2.x 3.3.5-beta
2.1.x 3.3.0-beta
diff --git a/docs/source/ontologies_included.md b/docs/source/ontologies_included.md
index 3474e0d..a9d337d 100644
--- a/docs/source/ontologies_included.md
+++ b/docs/source/ontologies_included.md
@@ -13,7 +13,6 @@ ontology. We used [Ontology2Dot](utils.md#ontology2dot) for that:
![ontology2dot sample image](./_static/img/ontology2dot.png)
-eval_rst
To use the city ontology you have to install it using the tool [Pico](utils.md#pico-installs-cuds-ontologies):
```sh
diff --git a/docs/source/ontology_intro.md b/docs/source/ontology_intro.md
index 5957b20..fe6a55d 100644
--- a/docs/source/ontology_intro.md
+++ b/docs/source/ontology_intro.md
@@ -10,4 +10,4 @@ the representational primitives include information about their meaning
and constraints on their logically consistent application. (Source:
)
-TODO extend
\ No newline at end of file
+[//]: # (TODO Extend)
diff --git a/docs/source/overview.md b/docs/source/overview.md
index 10f4691..5dca740 100644
--- a/docs/source/overview.md
+++ b/docs/source/overview.md
@@ -61,7 +61,7 @@ At this point, the results could be fetched again and for example, visualized wi
Exactly in the same way that the data can be moved between a database and a simulation engine using their respective wrappers, it can also be moved between simulation engines.
-This functionality facilitates the coupling and linking between such simulation engines. For example, in the domain of materials science, a certain engine might be useful for representing structures made up of atomistic particles (molecular dynamics), while another software tool could be focussed on representing bodies of fluids (fluid dynamics). As SimPhoNy can enable communication between the two tools, they could both be run and synced simultaneously to create more complex scenarios.
+This functionality facilitates the coupling and linking between such simulation engines. For example, in the domain of materials science, a certain engine might be useful for representing structures made up of atomistic particles (molecular dynamics), while another software tool could be focussed on representing bodies of fluids (fluid dynamics). As SimPhoNy can enable communication between the two tools, they could both be run and synced simultaneously to create more complex scenarios, such as a multi-scale simulation.