clarify custom JVM vs built-in language in docs.
mjohns-databricks committed May 14, 2024
1 parent f4245d1 commit 49a7366
Showing 3 changed files with 10 additions and 6 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -74,10 +74,10 @@ As of Mosaic 0.4.0 / DBR 13.3 LTS (subject to change in follow-on releases)...

__Additional Notes:__

- As of Mosaic 0.4.0 (subject to change in follow-on releases)...
+ Mosaic is a custom JVM library that extends Spark, which has the following implications in DBR 13.3 LTS:

1. [Unity Catalog](https://www.databricks.com/product/unity-catalog): Enforces process isolation, which is difficult to accomplish with custom JVM libraries; as such, only built-in (aka platform-provided) JVM APIs can be invoked from other supported languages in Shared Access Clusters.
- 2. [Volumes](https://docs.databricks.com/en/connect/unity-catalog/volumes.html): Along the same principle of isolation, clusters (both assigned and shared access) can read Volumes via relevant built-in readers and writers or via custom python calls which do not involve any custom JVM code.
+ 2. [Volumes](https://docs.databricks.com/en/connect/unity-catalog/volumes.html): Along the same principle of isolation, clusters can read Volumes via relevant built-in (aka platform-provided) readers and writers, or via custom Python calls which do not involve any custom JVM code.
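Because Volumes are mounted as ordinary POSIX paths (e.g. under `/Volumes/<catalog>/<schema>/<volume>`), plain Python file I/O suffices and no custom JVM code is involved. A minimal sketch, using a temporary directory as a hypothetical stand-in for a Volume mount:

```python
import os
import tempfile
from pathlib import Path

def read_volume_file(volume_path: str) -> bytes:
    # Volumes surface as regular file-system paths, so the built-in
    # Python I/O layer is enough -- no custom JVM code is invoked.
    return Path(volume_path).read_bytes()

# Demo: a temp directory stands in for a Volume mount path (hypothetical).
with tempfile.TemporaryDirectory() as vol:
    raster = os.path.join(vol, "raster.tif")
    Path(raster).write_bytes(b"fake-geotiff-bytes")
    assert read_volume_file(raster) == b"fake-geotiff-bytes"
```

The same pattern applies to built-in Spark readers and writers pointed at a Volume path; only custom JVM extensions are restricted on Shared Access Clusters.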

### Mosaic 0.3.x Series

6 changes: 4 additions & 2 deletions docs/source/index.rst
@@ -81,11 +81,13 @@ As of Mosaic 0.4.0 / DBR 13.3 LTS (subject to change in follow-on releases):
API changes, more `here <https://docs.databricks.com/en/udf/index.html>`_.

.. note::
+ Mosaic is a custom JVM library that extends Spark, which has the following implications in DBR 13.3 LTS:

  * `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ enforces process isolation, which is difficult
    to accomplish with custom JVM libraries; as such, only built-in (aka platform-provided) JVM APIs can be invoked from
    other supported languages in Shared Access Clusters.
- * Clusters (both Assigned and Shared Access) can read `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_
-   via relevant built-in readers and writers or via custom python calls which do not involve any custom JVM code.
+ * Clusters can read `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_ via relevant
+   built-in (aka platform-provided) readers and writers, or via custom Python calls which do not involve any custom JVM code.


Version 0.3.x Series
6 changes: 4 additions & 2 deletions docs/source/usage/installation.rst
@@ -34,11 +34,13 @@ As of Mosaic 0.4.0 / DBR 13.3 LTS (subject to change in follow-on releases):
API changes, more `here <https://docs.databricks.com/en/udf/index.html>`_.

.. note::
+ Mosaic is a custom JVM library that extends Spark, which has the following implications in DBR 13.3 LTS:

  * `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ enforces process isolation, which is difficult
    to accomplish with custom JVM libraries; as such, only built-in (aka platform-provided) JVM APIs can be invoked from
    other supported languages in Shared Access Clusters.
- * Clusters (both Assigned and Shared Access) can read `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_
-   via relevant built-in readers and writers or via custom python calls which do not involve any custom JVM code.
+ * Clusters can read `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_ via relevant
+   built-in (aka platform-provided) readers and writers, or via custom Python calls which do not involve any custom JVM code.

If you have cluster creation permissions in your Databricks
workspace, you can create a cluster using the instructions