diff --git a/bootstrap/ic-user-projects/create-projects-and-resources-job.yaml b/bootstrap/ic-user-projects/create-projects-and-resources-job.yaml
index 8d3ac75c..43837c44 100644
--- a/bootstrap/ic-user-projects/create-projects-and-resources-job.yaml
+++ b/bootstrap/ic-user-projects/create-projects-and-resources-job.yaml
@@ -532,6 +532,32 @@ spec:
             restartPolicy: Never
           EOF
 
+          # Create the Data Pipeline PVC
+          cat << EOF | oc apply -f-
+          kind: PersistentVolumeClaim
+          apiVersion: v1
+          metadata:
+            annotations:
+              openshift.io/description: ''
+              openshift.io/display-name: Data Pipeline
+              volume.beta.kubernetes.io/storage-provisioner: openshift-storage.cephfs.csi.ceph.com
+              volume.kubernetes.io/storage-provisioner: openshift-storage.cephfs.csi.ceph.com
+            name: processing-pipeline-storage
+            namespace: $USER_PROJECT
+            finalizers:
+              - kubernetes.io/pvc-protection
+            labels:
+              opendatahub.io/dashboard: 'true'
+          spec:
+            accessModes:
+              - ReadWriteMany
+            resources:
+              requests:
+                storage: 1Gi
+            storageClassName: ocs-storagecluster-cephfs
+            volumeMode: Filesystem
+          EOF
+
           sleep 20
         done
diff --git a/content/modules/ROOT/assets/images/05/05-data-pipeline-storage.png b/content/modules/ROOT/assets/images/05/05-data-pipeline-storage.png
new file mode 100644
index 00000000..145970d7
Binary files /dev/null and b/content/modules/ROOT/assets/images/05/05-data-pipeline-storage.png differ
diff --git a/content/modules/ROOT/pages/05-05-process-claims.adoc b/content/modules/ROOT/pages/05-05-process-claims.adoc
index 143ced87..d3d8c61b 100644
--- a/content/modules/ROOT/pages/05-05-process-claims.adoc
+++ b/content/modules/ROOT/pages/05-05-process-claims.adoc
@@ -35,58 +35,16 @@ Here are the main files of the pipeline and what they do:
 NOTE: In the folder, we still have for reference an Elyra version of the pipeline (`process_claims.pipeline`), but you cannot really use it from VSCode, which is the environment you should still be in.
 
-== Create a new Persistent Volume Claim (PVC)
+== Pipeline artefacts storage
 
-Before we can run the pipeline, we need to create a PVC that will be used to store intermediary files and results in.
+
+Before we can run the pipeline, we need a place to store intermediary files and results.
+
-* Go to the {ocp-short} Console
-* Make sure and change your view from **Developer** to **Administrator**
-+
-[.bordershadow]
-image::05/05-switch-to-admin-view.jpg[]
+This storage has already been created for you.
+
-* Under the Administrator view, navigate to **Storage** -> **PersistentVolumeClaims**
-+
-[.bordershadow]
-image::05/05-PVC.png[go to PVC]
+You can verify that the `Data Pipeline` cluster storage is available in the {rhoai} Dashboard, in the **Cluster Storage** section of your project.
+
-* Make sure you are in the right project (your username) and then press **Create PersistentVolumeClaim**
-+
 [.bordershadow]
-image::05/05-create-pvc-button.jpg[Create PVC]
-
-* Use these settings:
-** StorageClass:
-[.lines_space]
-[.console-input]
-[source, text]
-[subs=attributes+]
-ocs-storagecluster-cephfs
-** PersistentVolumeClaim name:
-[.lines_space]
-[.console-input]
-[source, text]
-[subs=attributes+]
-processing-pipeline-storage
-** Access mode:
-[.lines_space]
-[.console-input]
-[source, text]
-[subs=attributes+]
-Shared access (RWX)
-** Size:
-[.lines_space]
-[.console-input]
-[source, text]
-[subs=attributes+]
-1 GiB
-
-* it should look like:
-+
-[.bordershadow]
-image::05/05-PVC-settings.png[PVC settings]
-
-* Then press **Create**
+image::05/05-data-pipeline-storage.png[]
 
 == Import the pipeline
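
The bootstrap hunk above relies on heredoc variable expansion: because the `EOF` delimiter is unquoted, the shell substitutes `$USER_PROJECT` into the manifest before it reaches `oc apply`, so the loop stamps each user's namespace into their own PVC. A minimal sketch of that mechanism, runnable without a cluster — the project name and output path are placeholders, and the manifest is abridged; `oc apply` is only shown in a comment:

```shell
#!/bin/sh
# Sketch: render the PVC manifest the way the job script does, but write it to
# a file instead of piping it to `oc apply`, so the expansion can be inspected
# offline. USER_PROJECT=user1 is a placeholder; the real job sets it per user.
USER_PROJECT=user1

cat << EOF > /tmp/processing-pipeline-pvc.yaml
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: processing-pipeline-storage
  namespace: $USER_PROJECT
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 1Gi
  storageClassName: ocs-storagecluster-cephfs
EOF

# The unquoted EOF delimiter lets the shell substitute $USER_PROJECT:
grep 'namespace:' /tmp/processing-pipeline-pvc.yaml
# prints:   namespace: user1

# Against a real cluster, the rendered manifest would then be applied with:
#   oc apply -f /tmp/processing-pipeline-pvc.yaml
```

The `ReadWriteMany` access mode (backed here by the CephFS storage class) is what lets several pipeline pods mount the same volume at once, which is why the docs change drops the manual "Shared access (RWX)" creation steps in favor of this pre-provisioned claim.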