
ScalaBuff generated artifacts not usable from Spark? #100

Open
gmcelhanon opened this issue Dec 18, 2014 · 2 comments

@gmcelhanon

I am attempting to add protobuf support to my case classes in Spark. When I use a plain old case class for the values constructed in my Spark logic, the resulting objects retain their values. When I use the equivalent ScalaBuff-generated case class / companion object, constructed explicitly inline with "dummy" values, the data is also present and serializes correctly. However, when I project into the generated classes directly (e.g. using map or mapValues) for my Spark output, the data comes out empty at execution time: all values are lost and nothing serializes.
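
Roughly, the pattern looks like this (a minimal sketch of the shape of my job, not the real code; `PersonProto` and its `name`/`age` fields are stand-ins for the actual ScalaBuff-generated message, and the constructor shape is illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ScalaBuffSparkRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("scalabuff-repro").setMaster("local[*]"))

    val rows = sc.parallelize(Seq(("alice", 30), ("bob", 25)))

    // Works: generated instances constructed inline with "dummy" values
    // come out of the job with their data intact.
    val inline = rows.map(_ => PersonProto(name = "dummy", age = 1))

    // Fails: projecting the RDD values into the generated class loses the
    // data at execution time; the fields come back empty after serialization.
    val projected = rows.map { case (name, age) => PersonProto(name = name, age = age) }

    projected.collect().foreach(println)
    sc.stop()
  }
}
```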

Are you aware of any issues around this?

@SandroGrzicic
Owner

Hi,

can you provide a minimal example? I'm not aware of any issues around this.

Thanks.

@gmcelhanon
Author

Thanks for the quick reply. I will look into this further as I have time over the next couple of weeks. It's entirely possible this is "user error".
