
Spark contexts sometimes die and don't come back #144

Open
ehiggs opened this issue Jan 7, 2016 · 2 comments

ehiggs (Contributor) commented Jan 7, 2016

Spark contexts in the IPython notebook sometimes die. They can usually be restarted by closing the notebook, killing the kernel, and then reopening the notebook; sometimes, however, that doesn't work. This might go away with Spark 1.6.
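For reference, a context can sometimes be revived from inside the notebook without killing the kernel, by stopping the old SparkContext and constructing a fresh one. A minimal sketch, assuming PySpark is importable in the notebook kernel; the helper name, app name, and master URL are made up for illustration and `_active_spark_context` is a PySpark internal:

```python
# Minimal sketch, not part of this repository: stop any active SparkContext
# and create a fresh one, which is sometimes enough to recover a dead context.
from pyspark import SparkConf, SparkContext

def restart_spark_context(app_name="notebook", master="local[*]"):
    """Stop the active SparkContext (if any) and return a new one."""
    active = SparkContext._active_spark_context  # internal attribute, commonly used
    if active is not None:
        active.stop()  # release the dead/hung context's resources
    conf = SparkConf().setAppName(app_name).setMaster(master)
    return SparkContext(conf=conf)

sc = restart_spark_context()
print(sc.parallelize(range(10)).sum())  # quick sanity check: should print 45
```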

boegel (Member) commented May 18, 2016

Spark 1.6.1 has a bunch of bug fixes; maybe it remedies this problem: http://spark.apache.org/releases/spark-release-1-6-1.html

boegel (Member) commented May 20, 2016

Some more information I pulled out of @ehiggs:
