[SPARK-45026][CONNECT][FOLLOW-UP] Code cleanup
### What changes were proposed in this pull request?
Move three variables into the `isCommand` branch.

### Why are the changes needed?
They are not used in the other branches.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
CI

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42765 from zhengruifeng/SPARK-45026-followup.

Authored-by: Ruifeng Zheng <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
zhengruifeng committed Sep 2, 2023
1 parent e9962e8 commit f0fb434
Showing 1 changed file with 5 additions and 5 deletions.
```diff
@@ -2464,15 +2464,15 @@ class SparkConnectPlanner(val sessionHolder: SessionHolder) extends Logging {
       case _ => Seq.empty
     }
 
-    // Convert the results to Arrow.
-    val schema = df.schema
-    val maxBatchSize = (SparkEnv.get.conf.get(CONNECT_GRPC_ARROW_MAX_BATCH_SIZE) * 0.7).toLong
-    val timeZoneId = session.sessionState.conf.sessionLocalTimeZone
-
     // To avoid explicit handling of the result on the client, we build the expected input
     // of the relation on the server. The client has to simply forward the result.
     val result = SqlCommandResult.newBuilder()
     if (isCommand) {
+      // Convert the results to Arrow.
+      val schema = df.schema
+      val maxBatchSize = (SparkEnv.get.conf.get(CONNECT_GRPC_ARROW_MAX_BATCH_SIZE) * 0.7).toLong
+      val timeZoneId = session.sessionState.conf.sessionLocalTimeZone
+
       // Convert the data.
       val bytes = if (rows.isEmpty) {
         ArrowConverters.createEmptyArrowBatch(
```
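The refactoring pattern applied here, declaring values only in the branch that uses them, can be sketched as follows. This is a hypothetical, self-contained example (the object name `ScopeCleanup`, the method `process`, and the placeholder batch-size arithmetic are illustrative, not the actual Spark Connect code):

```scala
// Hypothetical sketch of the scope-narrowing cleanup: values needed only
// in one branch are declared inside that branch rather than before it.
object ScopeCleanup {
  def process(isCommand: Boolean, rows: Seq[Int]): String = {
    // Before the cleanup, the batch-size value would have been computed
    // unconditionally here, even when the non-command branch ran.
    if (isCommand) {
      // After the cleanup: computed only where it is actually used.
      val maxBatchSize = (1024 * 0.7).toLong
      s"arrow-batch(max=$maxBatchSize, rows=${rows.size})"
    } else {
      "no-op"
    }
  }
}
```

Besides avoiding wasted work on the non-command path, this keeps each value's scope as small as possible, which is the idiomatic choice in Scala.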
