I am attempting to add protobuf support to my case classes in Spark. When I use a plain old case class for the values constructed in my Spark logic, the values are present. When I construct the equivalent ScalaBuff-generated case class / companion object explicitly inline with "dummy" values, the data is also present and serializes correctly. However, when I project into the generated classes directly (e.g. via map or mapValues) for my output in Spark, the data comes out empty at execution time: all values are lost and nothing serializes.
Are you aware of any issues around this?
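A minimal sketch of the two scenarios described above, assuming a local Spark context. `Record` is a hypothetical stand-in for the ScalaBuff-generated message class; the real generated class's shape (mutable fields, defaults, companion object) depends on the generator and is not shown here.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Placeholder for the ScalaBuff-generated message class (hypothetical fields).
case class Record(id: Long = 0L, name: String = "")

object Repro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("scalabuff-repro").setMaster("local[*]"))

    // Case 1: generated class constructed explicitly inline with dummy values —
    // the data is present and serializes correctly.
    val inline = sc.parallelize(Seq(Record(1L, "dummy")))
    println(inline.collect().toList)

    // Case 2: projecting source data into the generated class via map —
    // in the reported failure, the fields come back empty at execution time.
    val projected = sc.parallelize(Seq((1L, "a"), (2L, "b")))
      .map { case (id, name) => Record(id, name) }
    println(projected.collect().toList)

    sc.stop()
  }
}
```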