[KYUUBI #5611][AUTHZ] Authz support path privilege for SaveIntoDataSourceCommand

### _Why are the changes needed?_
To close #5611
Authz support path privilege for SaveIntoDataSourceCommand
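As a rough, self-contained illustration of what "path privilege" means here (this is not the plugin's real API; the sample key `path` matches what the new extractor reads, and the object and path names below are hypothetical): a call like `df.write.format(...).save(path)` places the target path in the command's options map, and the plugin derives the URI to authorize from it.

```scala
// Hypothetical sketch: derive the URIs a path-privilege check would cover
// from the options map carried by SaveIntoDataSourceCommand.
object PathPrivilegeSketch {
  def urisToCheck(options: Map[String, String]): Seq[String] =
    options.get("path").toSeq
}
```

For example, `PathPrivilegeSketch.urisToCheck(Map("path" -> "/tmp/out"))` yields `Seq("/tmp/out")`, while an options map without a `path` entry yields no URIs to check.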

### _How was this patch tested?_
- [x] Add some test cases that check the changes thoroughly including negative and positive cases if possible

- [ ] Add screenshots for manual tests if appropriate

- [ ] [Run test](https://kyuubi.readthedocs.io/en/master/contributing/code/testing.html#running-tests) locally before making a pull request

### _Was this patch authored or co-authored using generative AI tooling?_
No

Closes #5613 from AngersZhuuuu/KYUUBI-5611.

Closes #5611

24081b1 [Angerszhuuuu] [KYUUBI #5611][AUTHZ] Authz support path privilege for SaveIntoDataSourceCommand

Authored-by: Angerszhuuuu <[email protected]>
Signed-off-by: Kent Yao <[email protected]>
AngersZhuuuu authored and yaooqinn committed Nov 3, 2023
1 parent c149809 commit 51dd31c
Showing 5 changed files with 25 additions and 2 deletions.
@@ -16,4 +16,5 @@
#

org.apache.kyuubi.plugin.spark.authz.serde.CatalogStorageFormatURIExtractor
org.apache.kyuubi.plugin.spark.authz.serde.OptionsUriExtractor
org.apache.kyuubi.plugin.spark.authz.serde.StringURIExtractor
@@ -1344,7 +1344,11 @@
"fieldName" : "query",
"fieldExtractor" : "LogicalPlanQueryExtractor"
} ],
-    "uriDescs" : [ ]
+    "uriDescs" : [ {
+      "fieldName" : "options",
+      "fieldExtractor" : "OptionsUriExtractor",
+      "isInput" : false
+    } ]
}, {
"classname" : "org.apache.spark.sql.hive.execution.CreateHiveTableAsSelectCommand",
"tableDescs" : [ {
@@ -41,3 +41,9 @@ class CatalogStorageFormatURIExtractor extends URIExtractor {
v1.asInstanceOf[CatalogStorageFormat].locationUri.map(uri => Uri(uri.getPath)).toSeq
}
}

class OptionsUriExtractor extends URIExtractor {
override def apply(v1: AnyRef): Seq[Uri] = {
v1.asInstanceOf[Map[String, String]].get("path").map(Uri).toSeq
}
}
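Outside the diff, the new extractor's behavior can be reproduced as a standalone sketch (`Uri` below is a minimal stand-in for the plugin's serde type, not the real class):

```scala
// Minimal stand-in for the plugin's Uri serde type.
case class Uri(path: String)

// Mirrors OptionsUriExtractor: pull the "path" entry, if any,
// out of a command's options map and wrap it as a Uri.
object OptionsUriSketch {
  def extract(v1: AnyRef): Seq[Uri] =
    v1.asInstanceOf[Map[String, String]].get("path").map(Uri).toSeq
}
```

An options map containing `"path" -> "/tmp/data"` yields one `Uri`; a map with no `path` entry yields an empty sequence, so commands without a target path produce no URI privilege checks.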
@@ -559,7 +559,8 @@ object TableCommands extends CommandSpecs[TableCommandSpec] {
val SaveIntoDataSourceCommand = {
val cmd = "org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand"
val queryDesc = queryQueryDesc
-    TableCommandSpec(cmd, Nil, queryDescs = Seq(queryDesc))
+    val uriDesc = UriDesc("options", classOf[OptionsUriExtractor])
+    TableCommandSpec(cmd, Nil, queryDescs = Seq(queryDesc), uriDescs = Seq(uriDesc))
}

val InsertIntoHadoopFsRelationCommand = {
@@ -1085,4 +1085,15 @@ class HiveCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
}
}
}
  test("SaveIntoDataSourceCommand") {
    withTempDir { path =>
      withSingleCallEnabled {
        val df = sql("SELECT 1 as id, 'Tony' as name")
        interceptContains[AccessControlException](doAs(
          someone,
          df.write.format("console").save(path.toString)))(
          s"does not have [select] privilege on [[$path, $path/]]")
      }
    }
  }
}
