Exploring performance issues with voila
I've reached the point where I think we should rethink how we do communication with widgets between the kernel and the frontend.
Change how we are sending data
We currently send data as JSON, but this JSON is sent at the same 'level' as the Jupyter protocol, and because that is also JSON, this works. However, it comes at the cost of the jupyter_server decoding and re-encoding the data. Not only is this a poor separation of communication layers, it is also quite slow (parsing, checking dates, `json_clean`, and copying dicts and lists).
Instead, I think all data (e.g. the state) should be sent in an already-encoded form (i.e. `json.dumps(state)`). Even better would be to send it using the comm buffers, so the jupyter_server (e.g. voila) would not even have to decode and encode the data. We could even use something like bson, but it should at least not be tied to the Jupyter protocol format.
I think this will speed things up quite a bit for the server, but also inside the kernel, since it will avoid `json_clean`.
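As a rough sketch of the idea (the `send_state_buffered` helper and the `update_buffered` method name are assumptions for illustration, not part of the current widget protocol): the kernel would serialize the state once and ship it as an opaque comm buffer.

```python
import json
from ipykernel.comm import Comm

def send_state_buffered(comm: Comm, state: dict) -> None:
    # Serialize the state once in the kernel. Because the bytes travel as a
    # comm buffer rather than inside the JSON message body, jupyter_server
    # (e.g. voila) can forward the frame without decoding/re-encoding it,
    # and the kernel-side json_clean pass over the state is avoided.
    payload = json.dumps(state).encode("utf-8")
    # 'update_buffered' is a hypothetical method name for this sketch.
    comm.send(data={"method": "update_buffered"}, buffers=[payload])
```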
1 comm to rule them all
Instead of many comms, we can choose to use only one, so we can batch changes into one message. Something like:
```python
with widgets.hold_sync():
    # create 1000 widgets
    ...
# on context leave, we send all changes in 1 comm message
```
This would give a performance boost mostly on the server side, it seems, because I've observed quite some overhead in sending websocket frames in tornado.
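For illustration only, here is a minimal, self-contained sketch of what such batching could look like; `hold_sync` as a module-level context manager, `queue_update`, and `send_message` are hypothetical names, not an existing ipywidgets API.

```python
import json
from contextlib import contextmanager

# Hypothetical sketch: collect per-widget state changes and flush them as one
# batched payload instead of one comm message per change.
_pending = {}        # widget id -> latest state dict
_batching = False

def queue_update(widget_id, state):
    # Called wherever a widget would normally send its state; while batching,
    # the change is only recorded.
    if _batching:
        _pending.setdefault(widget_id, {}).update(state)
    else:
        send_message({widget_id: state})

def send_message(updates):
    # Stand-in for sending one comm message; the real code would call
    # comm.send(..., buffers=[payload]) as sketched above.
    payload = json.dumps(updates).encode("utf-8")
    print(f"sending 1 message: {len(updates)} widget(s), {len(payload)} bytes")

@contextmanager
def hold_sync():
    # On context exit, all recorded changes go out in a single message.
    global _batching
    _batching = True
    try:
        yield
    finally:
        _batching = False
        if _pending:
            send_message(_pending)
            _pending.clear()
```

Creating 1000 widgets inside `with hold_sync():` would then produce a single comm message carrying all their states, rather than 1000 separate websocket frames.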