
POC: Use park_cursor to leave cursor at document end #31

Draft · wants to merge 1 commit into master

Conversation

smheidrich (Contributor)

Proof of concept to illustrate how to use `RustTokenizer`'s proposed `park_cursor` method¹ to avoid the "overconsumption" issue from #30 / smheidrich/py-json-stream-rs-tokenizer#47. Feel free to edit or open a new PR altogether with better ideas; this is really just to show how to use it.
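
For context, a minimal sketch of the "overconsumption" symptom (illustrative only; it assumes, per the linked issues, that the tokenizer reads ahead from the file in chunks):

```python
import io
import json_stream

f = io.StringIO('{"a": 1}\nmore non-JSON data')
data = json_stream.load(f)
print(data["a"])  # 1 -- fully consumes the top-level document
# The tokenizer may have buffered past the closing brace, so f's cursor
# can end up somewhere unrelated to the document end:
print(f.tell())
# The proposed park_cursor() would seek f back to just past the document,
# so the remaining data could be read normally.
```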

There is probably a more elegant solution that doesn't involve introducing the `level` param everywhere just to know when the top-level document ends, but I don't grok the code well enough to come up with one. E.g. I initially thought it would be possible to just alter `load` like this:

```python
def load(fp, ..., tokenizer=...):
    token_stream = tokenizer(fp)
    ...
    base = StreamingJSONBase.factory(token, token_stream, persistent)
    for thing in base:
        yield thing
    # if the tokenizer supports it, leave fp's cursor at the document end
    if getattr(token_stream, "park_cursor", None):
        token_stream.park_cursor()
```

which is similar to how you did it in the minimal example in smheidrich/py-json-stream-rs-tokenizer#47, but I guess that wouldn't work because people could no longer call e.g. `persistent()` on the immediate result of `StreamingJSONBase.factory`...
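
Concretely, existing call sites rely on `load` returning a streaming object rather than a generator; for instance (illustrative):

```python
import io
import json_stream

# works today: load() returns a StreamingJSONBase subclass
json_stream.load(io.StringIO("[1, 2, 3]")).persistent()

# with the generator version of load() sketched above, the call would
# instead return a generator object, and .persistent() would raise
# AttributeError
```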

In any case, this requires smheidrich/py-json-stream-rs-tokenizer#50 to be merged to actually work, but I think it makes sense to only merge that once everything has been decided on this end.


¹ Still open to better names, or to turning it into a context manager's `__exit__`, if that makes sense.
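
For illustration, the context-manager variant could look roughly like this (a sketch only; `ParkingTokenizer` is a hypothetical wrapper, not part of either library):

```python
class ParkingTokenizer:
    """Wrap a tokenizer so the cursor is parked when the block exits."""

    def __init__(self, tokenizer):
        self._tokenizer = tokenizer

    def __enter__(self):
        return self._tokenizer

    def __exit__(self, exc_type, exc, tb):
        # park the cursor even if iteration stopped early or raised
        park = getattr(self._tokenizer, "park_cursor", None)
        if park is not None:
            park()
        return False  # don't suppress exceptions

# usage:
# with ParkingTokenizer(tokenizer(fp)) as token_stream:
#     ...  # consume the document
# fp's cursor now sits just past the document end
```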

Commit message:

Proof of concept only. Maybe there is a more elegant solution that
doesn't require putting `level` everywhere to find out when the
top-level document ends.

Requires
smheidrich/py-json-stream-rs-tokenizer#50
to be merged to actually work.