Use textmate grammar instead of pygments #244
Hi @watermarkhu, this looks really interesting. At the moment I have started tackling #44 and #222, and the Pygments token output is just a mess to start parsing. I'll give it a shot and see if it can replace Pygments and then improve the functionality. Regarding starting up a new auto-documenter, I can only tell you how this domain was started. The original author basically built the documenter directly upon autodoc for Python, which gave them a good start and basis. A different approach for autodoc is taken in https://github.com/mozilla/sphinx-js. I hope this helps.
Good to hear! I'm currently mostly struggling with setting up roles in a new domain in order to make cross-referencing possible eventually. Can we possibly set up a call?
We can set up a call, but be warned: I am by no means an expert in cross-referencing. You can contact me at jorgen at cederberg dot be.
@watermarkhu Two comments on https://github.com/watermarkhu/textmate-grammar-python:
Do you want me to add them as issues?
Good to see! Adding the issues would be great. Let's discuss Python 3.9 support on the PR that you submitted.
Hello, not to step on any toes here, but I would like to know if this effort has stalled (understandably, time is always a valuable commodity)? The MATLAB library I maintain is currently going through a major documentation pass, and to that end I have allocated some time to working on tooling. As such, I think this would be a good place to start, as it will help in closing #52, #54, #212, and #222. Those four issues are currently my target to get done (perhaps in one fell swoop along with this one), as they would be very useful for our documentation. I have started an attempt to implement this.
Hi. It was definitely stalled; I have had zero time to work on this project, unfortunately. This week I'll give it a shot. The most difficult issue to solve is still #222.
If you want a starting point re: classes, I have now gotten most of a classdef parser written here: https://github.com/apozharski/matlabdomain/blob/only-enums/sphinxcontrib/textmate_parser.py, including argument blocks. I am happy to continue work on it and submit a PR, or you can pull out whatever is useful.
As an aside, there are definitely some bugs in the TextMate parser (watermarkhu/textmate-grammar-python#66 (comment), for example), though after digging I suspect they are in the underlying grammar maintained by MathWorks. I am currently looking at a possible fix for it, though @watermarkhu may have the inside track on understanding the grammar format.
Thanks. I will take this as a starting point; it looks very useful already. If you have any PRs, I'll work on the development branch.
@apozharski regarding priority of docstrings, they are as follows:
@apozharski I ran into an issue with class attributes and created an issue: watermarkhu/textmate-grammar-python#67. In the current parsing of classdef / method / property attributes I reuse the same method (matlabdomain/sphinxcontrib/mat_types.py, line 1474 at 4d890d7).
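As an aside for readers of this thread: the reason a single routine can cover classdef, properties, and methods attributes is that all three use the same `(Name, Name = value, ...)` attribute syntax in MATLAB. The sketch below only illustrates that shared shape; it is hypothetical and is not the routine at `mat_types.py` line 1474 (the helper name and regex are made up for illustration).

```python
# Hypothetical sketch: classdef, properties, and methods attributes all share
# the "(Name, Name = value, ...)" form, so one parsing routine can serve all
# three contexts. This is NOT the routine in mat_types.py; names are made up.
import re

_ATTR_LIST = re.compile(r"\(\s*(?P<attrs>[^)]*)\)")

def parse_attributes(header):
    """Return a dict mapping attribute names to values (True for bare flags)."""
    match = _ATTR_LIST.search(header)
    attrs = {}
    if not match:
        return attrs
    for item in match.group("attrs").split(","):
        name, _, value = item.partition("=")
        if name.strip():
            attrs[name.strip()] = value.strip() or True
    return attrs

print(parse_attributes("classdef (Abstract, Sealed) MyClass"))
# {'Abstract': True, 'Sealed': True}
print(parse_attributes("properties (Access = private, Constant)"))
# {'Access': 'private', 'Constant': True}
print(parse_attributes("methods (Static)"))
# {'Static': True}
```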
Yep that is what I thought was the case. Thanks for clarifying. I will do some cleanup and get the routines to check for non-consecutive comments and submit a PR to your dev branch.
@joeced After spending a few too many hours trying to fix the MathWorks-provided TextMate grammar, I am convinced that it is not worth continuing to force a square peg (a parsing system primarily designed for syntax highlighting) into the round hole of using it for extracting structure. After doing some research, there is a better alternative: https://github.com/acristoffers/tree-sitter-matlab, a MATLAB grammar for tree-sitter. Over the last couple of days I quickly threw together a working prototype with support for, I believe, the full suite of MATLAB syntax. I think this is the direction this project should go in, as it does not require us to fix yet more bugs in the TextMate grammar.
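For anyone curious what the tree-sitter route looks like in practice, here is a minimal sketch (not the prototype referenced in this thread). It assumes the py-tree-sitter bindings together with a tree-sitter-matlab Python wheel that exposes a `language()` entry point; the exact package names and constructor signatures vary between tree-sitter releases, so treat the details as assumptions.

```python
# Rough sketch (not the linked prototype): parse a MATLAB snippet with
# tree-sitter and print the named nodes of the resulting syntax tree.
# Assumes a recent py-tree-sitter and a tree-sitter-matlab wheel exposing
# language(); both are assumptions, not documented requirements.
import tree_sitter
import tree_sitter_matlab

MATLAB = tree_sitter.Language(tree_sitter_matlab.language())
parser = tree_sitter.Parser(MATLAB)

code = b"""
classdef MyClass
    properties
        x (1,1) double = 0  % a documented property
    end
    methods
        function obj = MyClass(x)
            obj.x = x;
        end
    end
end
"""

tree = parser.parse(code)

def walk(node, depth=0):
    # Node types come from the grammar itself, so extracting structure
    # reduces to walking the tree and matching on node.type.
    if node.is_named:
        print("  " * depth + node.type)
    for child in node.children:
        walk(child, depth + 1)

walk(tree.root_node)
```

The appeal over the TextMate approach is that the output is a concrete syntax tree keyed by grammar node types rather than a flat stream of highlight tokens.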
Hi @apozharski, thank you very much for looking into this and taking the time and effort. Much appreciated! I'm away from a computer at the moment, but will get back to you in 2 weeks.
Hi @apozharski - I'm back now! Did you have any time to work on using tree-sitter-matlab, and would you be willing to make a pull request?
Again - thanks for looking into this!
Hello @joeced, the latest work I have done on this is slowly beginning to fix things to get the tests back into working order (and I have found a bug along the way). Absolutely no problem!
This is resolved by the move to the tree-sitter backend #261. |
Hi @joeced, great work on maintaining this repo.
A year ago, I wanted to contribute to support argument blocks. However, I found the logic in `mat_types.py`, which is based on the Pygments tokens, to be very hard to work with and a bit unstable. Following MathWorks' support for VSCode, I had started working on a parser based on TextMate grammars in Python, the same grammars that VSCode uses for syntax highlighting. MathWorks is now also maintaining the MATLAB grammar.

The package is available at https://github.com/watermarkhu/textmate-grammar-python. If you are interested, I think this can be a good replacement for the current in-house parsing of `matlabdomain`. The benefit of using the TextMate grammar is that 1) due to its nested nature, the output is already a syntax tree, and 2) the grammar is now officially maintained by MathWorks and the contributors of the VSCode extension (a rough usage sketch follows below).

On a different topic, due to some requirements, I will need an auto-documenter that is compatible with Markdown docstrings. To this end, I've already started work on a new extension that depends on the myst-parser and is based on autodoc2. I would love to get in touch with you to understand `matlabdomain` better and see what I can re-use.
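As a rough illustration of how textmate-grammar-python could slot in as a parser backend, here is a minimal sketch. The import paths and method names (`LanguageParser`, `matlab.GRAMMAR`, `parse_string`) are assumptions about the package's interface and may differ between releases; check the project's README for the actual API.

```python
# Rough sketch, not verified against a specific textmate-grammar-python
# release: the module layout and method names below are assumptions.
from textmate_grammar.language import LanguageParser
from textmate_grammar.grammars import matlab

parser = LanguageParser(matlab.GRAMMAR)

# Parsing returns a nested element tree rather than a flat token stream,
# which is the main advantage over the Pygments-based approach above.
element = parser.parse_string("value = num2str(10);")
print(element)
```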