Ultraviolence (UV)
Design Notes
SwiftUI style declarative framework for Metal Rendering.
Goals
TODO
Example code:
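As a rough illustration of the declarative style being aimed for - every name here (UVBuilder, Render, the pass strings) is hypothetical, and real passes would wrap Metal pipeline state rather than strings - a result builder is the Swift mechanism that would give UV its SwiftUI-like syntax:

```swift
// Hypothetical sketch: a result builder collects the closure's entries
// into an array. UVBuilder and Render are invented names, and strings
// stand in for real render/compute passes.
@resultBuilder
enum UVBuilder {
    static func buildBlock(_ passes: String...) -> [String] { Array(passes) }
}

struct Render {
    let passes: [String]
    init(@UVBuilder _ content: () -> [String]) {
        passes = content()
    }
}

// Declarative composition: the closure lists passes in order.
let renderer = Render {
    "geometryPass"
    "bloomPass"
}
print(renderer.passes)
```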
Problems
Naming. We need a name for the View analog. I've been using RenderPass, but that's wrong. They are nodes in a graph, but Node doesn't work because Node will be used elsewhere in the code - for now I'll be calling them UV.View...

Type safety. Unlike SwiftUI Views, UV.Views do not possess unlimited composability. We need to emulate SwiftUI Views like Table, where the children are of limited type.

Metal rendering code runs in two main phases, "setup" and "render" (or "draw"). This is maybe badly represented by a single UV.View graph. Code running at setup time is expensive - things like compiling shaders, creating buffers, etc. Some of that setup can/should occur in the UV.View init methods. We may need an onSetup-style modifier.

Global state. There's really no way to implement SwiftUI State without some global state - Swift property wrappers are very limited and can't peek into the parent object they're contained in. So we need to store the current object (or a stack of objects) in a global variable so the body of the property wrapper can access it.
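A minimal sketch of that workaround, with hypothetical names (Node, nodeStack, UVState) - SwiftUI's real implementation differs, but the shape is the same: the wrapper resolves its storage through a global stack of reference-type nodes rather than through its enclosing value.

```swift
// Reference-type node that owns persistent storage for one view.
final class Node {
    var storage: [String: Any] = [:]
}

// Global stack: the node whose view is currently being evaluated is on top.
var nodeStack: [Node] = []

@propertyWrapper
struct UVState<Value> {
    let key: String
    let defaultValue: Value

    init(wrappedValue: Value, key: String) {
        defaultValue = wrappedValue
        self.key = key
    }

    var wrappedValue: Value {
        get {
            // Peek at the global stack, since the wrapper cannot see its parent.
            guard let node = nodeStack.last,
                  let value = node.storage[key] as? Value else { return defaultValue }
            return value
        }
        nonmutating set { nodeStack.last?.storage[key] = newValue }
    }
}

// Usage: the runtime would push the node before evaluating the view's body.
struct Counter {
    @UVState(key: "count") var count = 0
    func bump() { count += 1 }
}

nodeStack.append(Node())
let counter = Counter()
counter.bump()
counter.bump()
print(counter.count)  // state lives in the node, so it survives the value type
```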
Architecture

Users will be able to create renderers and GPGPU workflows by composing UV.Views together. Like SwiftUI, there will be State, EnvironmentValue, Binding, etc. (again, naming will be a problem here). Much of the UV API will be presented as a handful of UV.View objects and modifiers that can be composed together.

Like SwiftUI's View, UV.View is a protocol that is intended to be implemented by value types. By combining UV.Views together, the user can create a graph that describes the rendering pipeline (or pipelines).

A graph of nodes that mirrors the UV.View graph will be created. These nodes are reference types that contain, manage, and respond to changes in the graph's state. These nodes also contain the data needed to detect whether the UV.View they represent has changed and portions of the graph need to be recompiled.

The graph will be processed in two phases. The setup phase is represented by the init methods of the UV.View objects. This should happen only once, or when state changes.

The render phase (this needs a better name; "draw" is the wrong term to use for GPGPU graphs - maybe "workload" phase?) is handled by walking the graph and building up state used only at render/compute time. This render-time state is invisible to the UV.View objects.
Further Exploration
Tying in Metal Stitchable graphs is an interesting extension to this. Being able to stitch together shaders without having to write shader code may be very cool.