Automate PVC creation? #180

Merged · merged 1 commit into from Oct 23, 2024
26 changes: 26 additions & 0 deletions bootstrap/ic-user-projects/create-projects-and-resources-job.yaml
@@ -532,6 +532,32 @@ spec:
restartPolicy: Never
EOF

# Create the Data Pipeline PVC
cat << EOF | oc apply -f-
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  annotations:
    openshift.io/description: ''
    openshift.io/display-name: Data Pipeline
    volume.beta.kubernetes.io/storage-provisioner: openshift-storage.cephfs.csi.ceph.com
    volume.kubernetes.io/storage-provisioner: openshift-storage.cephfs.csi.ceph.com
  name: processing-pipeline-storage
  namespace: $USER_PROJECT
  finalizers:
    - kubernetes.io/pvc-protection
  labels:
    opendatahub.io/dashboard: 'true'
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 1Gi
  storageClassName: ocs-storagecluster-cephfs
  volumeMode: Filesystem
EOF

sleep 20

done
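The loop above pauses with a fixed `sleep 20` after applying the resources. As a sketch of an alternative, a recent `oc`/`kubectl` (one that supports `--for=jsonpath`, available since kubectl 1.23) could instead wait explicitly for the PVC to bind; the names and timeout below simply mirror the manifest in this diff:

```shell
# Wait up to 60s for the PVC to reach the Bound phase instead of sleeping a
# fixed amount of time. Note: with a WaitForFirstConsumer storage class the
# PVC stays Pending until a pod mounts it, so a timeout here is not fatal.
oc wait pvc/processing-pipeline-storage \
  -n "$USER_PROJECT" \
  --for=jsonpath='{.status.phase}'=Bound \
  --timeout=60s || echo "PVC not bound yet; continuing"
```

This trades a blind delay for an explicit readiness check, at the cost of requiring a newer client.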
52 changes: 5 additions & 47 deletions content/modules/ROOT/pages/05-05-process-claims.adoc
@@ -35,58 +35,16 @@ Here are the main files of the pipeline and what they do:

NOTE: For reference, the folder also contains an Elyra version of the pipeline (`process_claims.pipeline`), but you cannot use it from VSCode, which is the environment you should still be in.

-== Create a new Persistent Volume Claim (PVC)
+== Pipeline artefacts storage

-Before we can run the pipeline, we need to create a PVC that will be used to store intermediary files and results in. +
+Before we can run the pipeline, we need a place to store intermediary files and results in. +

-* Go to the {ocp-short} Console
-* Make sure and change your view from **Developer** to **Administrator**
-+
-[.bordershadow]
-image::05/05-switch-to-admin-view.jpg[]
+This storage has already been created for you. +

-* Under the Administrator view, navigate to **Storage** -> **PersistentVolumeClaims**
-+
-[.bordershadow]
-image::05/05-PVC.png[go to PVC]
+You can verify that the Cluster Storage `Data Pipeline` is available in the {rhoai} Dashboard, in the **Cluster Storage** section of your project. +

-* Make sure you are in the right project (your username) and then press **Create PersistentVolumeClaim**
-+
-[.bordershadow]
-image::05/05-create-pvc-button.jpg[Create PVC]
-
-* Use these settings:
-** StorageClass:
-[.lines_space]
-[.console-input]
-[source, text]
-[subs=attributes+]
-ocs-storagecluster-cephfs
-** PersistentVolumeClaim name:
-[.lines_space]
-[.console-input]
-[source, text]
-[subs=attributes+]
-processing-pipeline-storage
-** Access mode:
-[.lines_space]
-[.console-input]
-[source, text]
-[subs=attributes+]
-Shared access (RWX)
-** Size:
-[.lines_space]
-[.console-input]
-[source, text]
-[subs=attributes+]
-1 GiB
-
-* it should look like:
-+
-[.bordershadow]
-image::05/05-PVC-settings.png[PVC settings]
-
-* Then press **Create**
+image::05/05-data-pipeline-storage.png[]
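Alongside the dashboard check the page describes, a CLI spot-check is possible; this sketch assumes `$USER_PROJECT` holds your project name and uses the PVC name created by the bootstrap job in this PR:

```shell
# Confirm the pre-created pipeline PVC exists and inspect its status,
# storage class, and requested capacity in one line of output.
oc get pvc processing-pipeline-storage -n "$USER_PROJECT" \
  -o custom-columns=NAME:.metadata.name,STATUS:.status.phase,CLASS:.spec.storageClassName,CAPACITY:.spec.resources.requests.storage
```

A `Bound` status means the claim is ready for the pipeline to mount.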

== Import the pipeline
