Replies: 1 comment
-
After further tries, the conclusion I am drawing is that any statement that mentions a path, or accesses a path via the Spark DataFrame API, fails with an error similar to the one in the title.
I came across #5166 (comment).
-
Hi,
I am experimenting with Ranger and the Kyuubi Spark Ranger plugin.
I have set up all components in Docker, namely a standalone HMS, Minio (for S3), and the Ranger containers (as in https://hub.docker.com/r/apache/ranger).
I have set up an allow policy on the dev-hive service in Ranger and mapped it to my user role.
I am then trying to create a table through Spark running on my local Mac, after placing the Ranger shaded JAR and the required config as instructed at https://kyuubi.readthedocs.io/en/v1.7.3/security/authorization/spark/install.html.
I am using the below to get the Spark session:
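A minimal sketch of the session setup, assuming a local Minio endpoint and a standalone HMS; the endpoint URLs, credentials, and metastore URI below are placeholders (only the `spark.sql.extensions` line is from the original post):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("kyuubi-ranger-test")
    # The Kyuubi authz extension under test (from the original post):
    .config("spark.sql.extensions", "org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension")
    # Hypothetical standalone-HMS endpoint:
    .config("hive.metastore.uris", "thrift://localhost:9083")
    # Hypothetical Minio/S3A settings (placeholder credentials):
    .config("spark.hadoop.fs.s3a.endpoint", "http://localhost:9000")
    .config("spark.hadoop.fs.s3a.access.key", "minioadmin")
    .config("spark.hadoop.fs.s3a.secret.key", "minioadmin")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .enableHiveSupport()
    .getOrCreate()
)
```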
The below code is failing:
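A hypothetical reproduction of the failing statement, assuming the table is created at an explicit s3a:// location (the database, table, and bucket names are placeholders):

```python
# Creating a table whose data lives on Minio/S3 via the s3a connector.
spark.sql("""
    CREATE TABLE default.test_tbl (id INT, name STRING)
    USING parquet
    LOCATION 's3a://warehouse/test_tbl'
""")
```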
Below is the exception:
I am able to execute the code and create tables without Kyuubi (and thus Ranger), i.e. without:
.config("spark.sql.extensions", "org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension")
I also noted that the Ranger policy defined for Hive is taking effect, i.e. table-level authorization passes through the Spark-Kyuubi-Ranger trio (tested separately by removing the role mapping, which produced a create-table permission error in addition to the S3 write permission error).
Essentially, what I am asking is: where do I define a policy granting permissions on the S3 location, and how do I make Spark/Kyuubi recognize that location as covered by the policy? A sketch of what I was considering on the Ranger side follows.
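For reference, this is the shape of what I was considering: a policy on the url resource of the Hive service, created through Ranger's public v2 REST API. This is an assumption on my side, not something confirmed by the Kyuubi docs; the Ranger host, admin credentials, user, and bucket path are all placeholders, and whether Kyuubi honours url policies for this check is exactly what I am asking about.

```python
import requests

# Hypothetical Ranger policy on the dev-hive service's "url" resource,
# granting read/write on an s3a path to a placeholder user.
policy = {
    "service": "dev-hive",
    "name": "allow-warehouse-s3-path",
    "resources": {
        "url": {"values": ["s3a://warehouse/*"], "isRecursive": True}
    },
    "policyItems": [
        {
            "accesses": [
                {"type": "read", "isAllowed": True},
                {"type": "write", "isAllowed": True},
            ],
            "users": ["myuser"],  # placeholder user
        }
    ],
}

resp = requests.post(
    "http://localhost:6080/service/public/v2/api/policy",  # placeholder Ranger host
    json=policy,
    auth=("admin", "rangerR0cks!"),  # placeholder admin credentials
)
resp.raise_for_status()
```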
Versions:
Spark: 3.4.4 local
kyuubi-spark-authz-shaded: 2.12-1.9.0
HMS standalone: 3.0.0 (used this tutorial)
Ranger: 2.4.0