Automatic generation/sourcing of the C code? #24
Yeah, this definitely seems like a reasonable thing to try to do. One thing that may be worth considering here: if there are any breaking changes in the structs or in the function signatures and we just pull in the headers, we would still be able to build and compile the cffi module without a problem, and we would only see the resulting breakage at runtime. If we had a few more tests in place this might be fine to do, but I'd worry about missing breaking changes otherwise. I would think we should be able to find the header locations through something like pkg-config if we do want to do something like that. One thing that would be nice would be to autogenerate some of the extra headers (currently we have …
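For the header-location part, something along these lines might work with pkg-config (a rough, untested sketch; the package names "wlroots" and "pixman-1" are my guesses, and the result can be empty when the headers live on the default search path):

import subprocess

def pkg_config_include_dirs(package):
    # Ask pkg-config for the -I flags of the given package and strip the
    # "-I" prefix from each entry.
    flags = subprocess.run(
        ["pkg-config", "--cflags-only-I", package],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    return [flag[2:] for flag in flags if flag.startswith("-I")]

# e.g. include_dirs = pkg_config_include_dirs("wlroots") + pkg_config_include_dirs("pixman-1")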
Yeah, with this approach it'd be good to test even basic things, like whether a function does what it says. If the Python code were also autogenerated, the two would always be synchronised. It'd be cool if we could get a GitHub action that responds to wlroots releases and triggers a new release here, so that the packages get regenerated against the new wlroots.
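As a starting point, a scheduled action could run a small check like this (untested sketch: it assumes the swaywm/wlroots GitHub repository publishes releases via the API, and PINNED is a made-up placeholder for whatever version we currently target):

import json
import urllib.request

PINNED = "0.14.1"  # placeholder: the wlroots version we last generated against

def latest_wlroots_release():
    # Ask the GitHub API for the most recent wlroots release tag.
    url = "https://api.github.com/repos/swaywm/wlroots/releases/latest"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["tag_name"]

if __name__ == "__main__":
    latest = latest_wlroots_release()
    if latest.lstrip("v") != PINNED:
        print(f"new wlroots release {latest}: trigger a regeneration")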
The thing with this concern is that it already exists today: once the next wlroots release is out, we'll have to go through manually and update the CDEF and the corresponding Python interface anyway, so this would remove the need for the former (and autogenerating the Python removes the need for the latter too). I kinda see this as the "end game" for this kind of cffi module.
Yeah, agreed. A pretty cool real-world example: https://github.com/nanomsg/nnpy/blob/master/generate.py. They seem to scrape the source header files to generate their CDEF.
I had a play around with this, with limited success; the ffi_build.py is below. The installed wlroots headers have some macros that need expanding before cffi gets them, so I added a preprocessor step using distutils, which actually worked quite well. The next issue I came across was that cffi wants some types to be defined already, and because wlroots interfaces with many other libraries (e.g. XCB, EGL), some of these need to be declared even if we don't want to pull in those headers. I didn't find a nice way to do this, so I just prepend them to the CDEF. Another issue has appeared, though, that I do not have a solution for. Compilation gives me this error:
However, defining …

Anyway, that's where I'm up to. It's been a bit tedious and I've hit a dead end, so I'm going to give up on this for a bit. It also makes me wonder: is it worth it? This approach would expose all of wlroots in pywlroots. Is that a good thing? Many parts should be added, sure, but maybe many other parts will never be needed. I don't know.

import tempfile
from distutils.ccompiler import new_compiler
from pathlib import Path
from cffi import FFI
from pywayland.ffi_build import ffi_builder as pywayland_ffi
from xkbcommon.ffi_build import ffibuilder as xkb_ffi
include_dir = (Path(__file__).parent.parent / "include").resolve()
assert include_dir.is_dir(), f"missing {include_dir}"
include_dirs = ["/usr/include/pixman-1", include_dir]
wlr = Path("/usr/include/wlr")
assert wlr.exists()
headers = list(wlr.glob("**/*.h"))
SOURCE = "\n".join(f'#include <{header.as_posix()}>' for header in headers)
SOURCE += """
#include <xkbcommon/xkbcommon.h>
#include <xkbcommon/xkbcommon-keysyms.h>
#include <xkbcommon/xkbcommon-compose.h>
struct wl_listener_container {
    void *handle;
    struct wl_listener destroy_listener;
};

typedef void (*wrapped_log_func_t)(enum wlr_log_importance importance, const char *log_str);

wrapped_log_func_t py_callback = NULL;

void wrapped_log_callback(enum wlr_log_importance importance, const char *fmt, va_list args)
{
    char formatted_str[4096];
    vsnprintf(formatted_str, 4096, fmt, args);
    py_callback(importance, formatted_str);
}

void wrapped_log_init(enum wlr_log_importance verbosity, wrapped_log_func_t callback)
{
    if (callback == NULL)
    {
        wlr_log_init(verbosity, NULL);
    }
    else
    {
        py_callback = callback;
        wlr_log_init(verbosity, wrapped_log_callback);
    }
}
"""
VERSION = """
#define WLR_VERSION_MAJOR ...
#define WLR_VERSION_MINOR ...
#define WLR_VERSION_MICRO ...
"""
def gen_cdef():
    """
    Read wlroots headers from system.

    Adapted from nanomsg/nnpy.
    """
    BLOCKS = {'{': '}', '(': ')'}
    IGNORED = ["log.h"]

    # We want to put wlr/types first
    TYPES = [p for p in headers if "wlr/types/" in p.as_posix()]
    ordered = TYPES + [p for p in headers if p not in TYPES]

    lines = []
    for header in ordered:
        if header.name in IGNORED:
            continue
        with header.open("r") as fd:
            cont = ''
            for ln in fd.readlines():
                if not ln.strip():
                    continue
                # A trailing comma means the previous declaration continues
                # onto this line.
                if cont == ',':
                    lines.append(ln)
                    cont = ''
                    continue
                # Inside a brace/paren block: keep lines until it closes.
                if cont in BLOCKS:
                    lines.append(ln)
                    if BLOCKS[cont] in ln:
                        cont = ''
                    continue
                if ln.startswith('#include'):
                    continue
                lines.append(ln)
                # Remember the last character so we know whether this line
                # opens a block or continues on the next one.
                cont = ln.strip()[-1]
    return "".join(lines)
def preprocess_cdef(cdef):
    """
    Preprocess headers to expand macros.
    """
    with tempfile.TemporaryDirectory() as name:
        temp = Path(name)
        pre = temp / "pre.c"
        post = temp / "post.c"
        with pre.open('w') as fd:
            fd.write(cdef)
        # Run the C preprocessor (cc -E) over the scraped declarations so that
        # macros are expanded before cffi sees them.
        compiler = new_compiler(verbose=1)
        compiler.preprocessor = ["cc", "-E"]
        compiler.preprocess(
            pre.as_posix(),
            post.as_posix(),
            macros=[('WLR_USE_UNSTABLE', 1)],
            include_dirs=include_dirs,
        )
        # Drop the '#' linemarkers that the preprocessor emits.
        with post.open('r') as fd:
            cdef_expanded = [
                line for line in fd.readlines() if not line.startswith('#')
            ]
    return '\n'.join(cdef_expanded)
def prepend_types(cdef):
    """
    Some types must be declared inside the CDEF, so we stick skeleton definitions
    for those at the start.
    """
    # Integer-like types from other libraries: declare them all as plain ints
    # so cffi knows about them without pulling in the real headers.
    ints = [
        'clockid_t',
        'pid_t',
        'time_t',
        'xcb_pixmap_t',
        'xcb_window_t',
        'xcb_atom_t',
        'EGLenum',
        'GLenum',
        'GLuint',
        'dev_t',
    ]
    prefix = "\n".join(f"typedef int {i};" for i in ints)
    prefix += """\nenum wlr_axis_source {
    ...;
};\n"""

    # Structs from other libraries: declare them with "...;" so their layout
    # is left to the compiler.
    structs = [
        'xcb_generic_event_t',
        'pixman_region32_t',
        'pixman_box32_t',
    ]
    for s in structs:
        struct_name = s[:-2]  # without the _t suffix
        prefix += "struct %s { ...; };\n" % struct_name
        prefix += "typedef struct %s %s;\n" % (struct_name, s)

    # EGL/GL handle types and function-pointer typedefs, declared as void *
    # placeholders.
    voids = [
        "EGLDisplay",
        "EGLContext",
        "EGLSurface",
        "EGLDeviceEXT",
        "PFNEGLGETPLATFORMDISPLAYEXTPROC",
        "PFNEGLCREATEIMAGEKHRPROC",
        "PFNEGLDESTROYIMAGEKHRPROC",
        "PFNEGLQUERYWAYLANDBUFFERWL",
        "PFNEGLBINDWAYLANDDISPLAYWL",
        "PFNEGLUNBINDWAYLANDDISPLAYWL",
        "PFNEGLQUERYDMABUFFORMATSEXTPROC",
        "PFNEGLQUERYDMABUFMODIFIERSEXTPROC",
        "PFNEGLEXPORTDMABUFIMAGEQUERYMESAPROC",
        "PFNEGLEXPORTDMABUFIMAGEMESAPROC",
        "PFNEGLDEBUGMESSAGECONTROLKHRPROC",
        "PFNEGLQUERYDISPLAYATTRIBEXTPROC",
        "PFNEGLQUERYDEVICESTRINGEXTPROC",
        "EGLImageKHR",
        "EGLint",
    ]
    prefix += "\n".join(f"typedef void *{v};" for v in voids)
    return VERSION + prefix + cdef
if __name__ == "__main__":
CDEF = gen_cdef()
CDEF = preprocess_cdef(CDEF)
CDEF = prepend_types(CDEF)
ffi_builder = FFI()
ffi_builder.set_source(
"wlroots._ffi",
SOURCE,
libraries=["wlroots"],
define_macros=[("WLR_USE_UNSTABLE", None)],
include_dirs=include_dirs,
)
ffi_builder.include(pywayland_ffi)
ffi_builder.include(xkb_ffi)
ffi_builder.cdef(CDEF)
ffi_builder.compile() PS. to compile, generated header files for all wlroots protocols need to be in pywlroots/include |
I took a look at how zig-wlroots circumvents the same issue. It has a scanner step that takes these extra protocol files as part of its function signature to generate the relevant code. Maybe we can venture down such a path too? I think pkg-config is the right way to detect where the headers are located.
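Something like the sketch below might do as a first pass for those protocol headers (untested; it assumes wayland-scanner and the wayland-protocols package are installed, and the output file names would need to match whatever the wlroots headers actually #include):

import subprocess
from pathlib import Path

def generate_protocol_headers(out_dir="include"):
    # Locate the protocol XML files shipped by wayland-protocols ...
    datadir = subprocess.run(
        ["pkg-config", "--variable=pkgdatadir", "wayland-protocols"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    # ... and run wayland-scanner over each one to produce a server header.
    for xml in Path(datadir).glob("**/*.xml"):
        header = out / f"{xml.stem}-protocol.h"
        subprocess.run(
            ["wayland-scanner", "server-header", str(xml), str(header)],
            check=True,
        )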
Each time I copy some code from wlroots to ffi_build.py I think: wouldn't it be nice if ffi_build.py could find the code that is accessed as lib.<something> and take it directly from wlroots? Do you think this is feasible? This could save a lot of work overall, though I'd have no clue where to start.
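As a very rough starting point (untested sketch; the "wlroots" package directory is an assumption), the ast module can at least list every name the Python code accesses on lib, i.e. the set of symbols the CDEF actually has to cover:

import ast
from pathlib import Path

def find_lib_accesses(package_dir="wlroots"):
    # Collect every attribute accessed as lib.<something> in the Python sources.
    names = set()
    for path in Path(package_dir).glob("**/*.py"):
        tree = ast.parse(path.read_text())
        for node in ast.walk(tree):
            if (
                isinstance(node, ast.Attribute)
                and isinstance(node.value, ast.Name)
                and node.value.id == "lib"
            ):
                names.add(node.attr)
    return sorted(names)

# e.g. print("\n".join(find_lib_accesses()))

Pulling the matching declarations out of the wlroots headers automatically would be the harder half, but even this list would make it obvious what's missing.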