diff --git a/src/beginning/assets.md b/src/beginning/assets.md
index cc087673..d2d31917 100644
--- a/src/beginning/assets.md
+++ b/src/beginning/assets.md
@@ -42,7 +42,7 @@ you'll also see a preview of the model.
 
 ![preview](preview.gif)
 
-The maximum amount of asset instances is not limited by the engine but it is by the memory and CPU resources of your PC.
+The maximum number of asset instances is not limited by the engine, but by the memory and CPU resources of your PC.
 Note that the engine does try to reuse data across instances as much as possible.
 
 You can also instantiate assets dynamically from your code. Here's an example of that for a Model:
diff --git a/src/beginning/editor_overview.md b/src/beginning/editor_overview.md
index 8f02ba63..8684ccfd 100644
--- a/src/beginning/editor_overview.md
+++ b/src/beginning/editor_overview.md
@@ -8,7 +8,7 @@ This chapter will guide you through the basics, advanced topics will be covered
 
 ## Windows
 
-When you open the editor for the first time you may be confused by the amount of windows, buttons, lists, etc. you'll be presented
+When you open the editor for the first time you may be confused by the number of windows, buttons, lists, etc. you'll be presented
 with. Each window serves a different purpose, but all of them work together to help you make your game.
 Let's take a look at a screenshot of the editor and learn what each part of it is responsible for (please note that this can change over time, because development is quite fast and images can easily become outdated):
diff --git a/src/code/snippets/src/borrowck/mod.rs b/src/code/snippets/src/borrowck/mod.rs
index 91a88f6f..f946b4fa 100644
--- a/src/code/snippets/src/borrowck/mod.rs
+++ b/src/code/snippets/src/borrowck/mod.rs
@@ -32,7 +32,7 @@ impl ScriptTrait for MyScript {
         let some_other_node_ref = mbc.try_get_mut(self.some_other_node).unwrap();
         let yet_another_node_ref = mbc.try_get_mut(self.yet_another_node).unwrap();
 
-        // We can borrow the same node immutably pretty much infinite amount of times, if it wasn't
+        // We can borrow the same node immutably pretty much an infinite number of times, if it wasn't
         // borrowed mutably.
         let some_node_ref_2 = mbc.try_get(self.some_node).unwrap();
     }
diff --git a/src/code/snippets/src/scene/mesh.rs b/src/code/snippets/src/scene/mesh.rs
index fc56cf9b..15b20cb8 100644
--- a/src/code/snippets/src/scene/mesh.rs
+++ b/src/code/snippets/src/scene/mesh.rs
@@ -68,7 +68,7 @@ fn create_procedural_mesh(scene: &mut Scene, resource_manager: ResourceManager)
         .with_surfaces(vec![SurfaceBuilder::new(SurfaceResource::new_ok(
             ResourceKind::Embedded,
             // Our procedural mesh will have a form of squashed cube.
-            // A mesh can have unlimited amount of surfaces.
+            // A mesh can have an unlimited number of surfaces.
             SurfaceData::make_cube(Matrix4::new_nonuniform_scaling(&Vector3::new(
                 25.0, 0.25, 25.0,
             ))),
diff --git a/src/editor/plugins.md b/src/editor/plugins.md
index 3c8eaaf5..5f9a664a 100644
--- a/src/editor/plugins.md
+++ b/src/editor/plugins.md
@@ -78,7 +78,7 @@ approach for visualization is just a custom structure with a few methods:
 ```
 
 `sync_to_model` method can be called on every frame in `update` method of the interaction mode (see below) - it tracks
-the amount of scene nodes representing points of the line and if there's mismatch, it recreates the entire set.
+the number of scene nodes representing points of the line and, if there's a mismatch, it recreates the entire set.
 `remove_points` should be used when the gizmo is about to be deleted (usually together with the interaction mode).
 
 All interaction with scene nodes should be performed using interaction modes. Interaction mode is a tiny abstraction layer,
diff --git a/src/editor/property_editors.md b/src/editor/property_editors.md
index 6eff3d5d..60ee68cd 100644
--- a/src/editor/property_editors.md
+++ b/src/editor/property_editors.md
@@ -42,7 +42,7 @@ following example shows a typical usage:
 {{#include ../code/snippets/src/editor/prop_editors.rs:add_enum_property_editor}}
 ```
 
-As you can see, your enumeration needs a decent amount of trait implementations, hopefully all of them can be derived.
+As you can see, your enumeration needs a decent number of trait implementations; hopefully all of them can be derived.
 
 ### Inheritable Properties
diff --git a/src/performance/index.md b/src/performance/index.md
index a08277b2..4380778e 100644
--- a/src/performance/index.md
+++ b/src/performance/index.md
@@ -8,7 +8,7 @@ For the vast majority of cases, standard engine approaches are perfectly fine.
 
 Theoretically, the ECS approach _can_ give you better performance, but lets at first see where ECS is beneficial,
 and why classic approach is still viable.
 The ECS is beneficial _only_ in cases where you have to process **ten or hundreds thousands** objects every frame,
 the performance gain of cache friendliness can be significant
-in such cases. But let's stop for a second and ask ourselves again: how _often_ games have such huge amount of objects
+in such cases. But let's stop for a second and ask ourselves again: how _often_ do games have such a huge number of objects
 that has to be processed every frame? There are very few examples of such games:
 
 - Strategy games - at some extent, because there are very few games that allows you to control tens of thousands
@@ -20,7 +20,7 @@ Note that the list does not include games with vast worlds, why so? The reason i
 process every tiny object in the world at once, instead they split the world in small chunks and process only few
 chunks at once, those where the player is present.
 
-The rest of genres operate on a tiny amount of object compared to those up above, maybe a few hundreds at max.
+The rest of the genres operate on a tiny number of objects compared to those above, maybe a few hundred at most.
 One might say - hey, each object could contain lots of tiny "moving parts", what's about them? Usually each object
 contains up to 10-15 sub-parts, which leads us to few thousands of "atomic" object. Is it much? Not really.
diff --git a/src/physics/physics.md b/src/physics/physics.md
index aeb7de5e..f8db04ee 100644
--- a/src/physics/physics.md
+++ b/src/physics/physics.md
@@ -19,6 +19,6 @@ rigid bodies in the world.
 
 There is a very few differences between 3D and 2D physics, the most obvious is that 2D physics does simulation only
 in oXY plane (the plane of the screen). 2D physics has less collider shapes available since some 3D shapes degenerate in
-2D, for example cylinder 3D shape in 2D is just a rectangle. There is also lesser amount of joints available in 2D,
+2D, for example a 3D cylinder shape in 2D is just a rectangle. There is also a smaller number of joints available in 2D;
 there is no revolute joint for example. Unlike 3D physics entities, 2D physics entities exist in the separate
 `scene::dim2` module.
\ No newline at end of file
diff --git a/src/physics/ragdoll.md b/src/physics/ragdoll.md
index cd4bc44d..f1fefdd0 100644
--- a/src/physics/ragdoll.md
+++ b/src/physics/ragdoll.md
@@ -36,7 +36,7 @@ scene nodes in the world viewer, located under a `Ragdoll` scene node:
 
 ![ragdoll result](ragdoll2.png)
 
-As you can see, the amount of entities you'd have to create and configure manually is quite high. Keep in mind, that
+As you can see, the number of entities you'd have to create and configure manually is quite high. Keep in mind that
 ragdoll wizard can't generate perfect ragdoll, because of lack of information. The generated ragdoll will most likely
 require some minor tweaks (mostly joint angular limits).
diff --git a/src/physics/ray.md b/src/physics/ray.md
index 3b476f0b..21b0d04d 100644
--- a/src/physics/ray.md
+++ b/src/physics/ray.md
@@ -26,8 +26,8 @@ the `collider` and fetch its `parent` field: `graph[collider].parent()`.
 - `normal` - a normal at the intersection position in world coordinates.
 - `position` - a position of the intersection in world coordinates.
 - `feature` - additional data that contains a kind of the feature with which intersection was detected as well as its
-index. FeatureId::Face might have index that is greater than amount of triangles in a triangle mesh, this means that
-intersection was detected from "back" side of a face. To "fix" that index, simply subtract amount of triangles of a
+index. FeatureId::Face might have an index that is greater than the number of triangles in a triangle mesh; this means
+that the intersection was detected from the "back" side of a face. To "fix" that index, simply subtract the number of triangles of a
 triangle mesh from the value.
 - `toi` - (`time of impact`) a distance from ray's origin to `position`.
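The back-face index fix described in the `ray.md` hunk above is plain arithmetic, so it can be sketched without any engine types. A minimal, hypothetical helper (`fix_face_index` is not a fyrox API; the sketch assumes back-face indices are offset by exactly the triangle count, as the text states):

```rust
/// If a `FeatureId::Face` index is greater than or equal to the number of
/// triangles in the mesh, the hit came from the "back" side of a face, and the
/// real triangle index is recovered by subtracting the triangle count.
fn fix_face_index(face_index: usize, triangle_count: usize) -> usize {
    if face_index >= triangle_count {
        // Back-side hit: the reported index is offset by the triangle count.
        face_index - triangle_count
    } else {
        // Front-side hit: the index is already a valid triangle index.
        face_index
    }
}

fn main() {
    let triangle_count = 100;
    assert_eq!(fix_face_index(42, triangle_count), 42); // front-side hit
    assert_eq!(fix_face_index(142, triangle_count), 42); // back-side hit
}
```

For a mesh of `n` triangles, front-side hits report indices in `0..n` and back-side hits in `n..2n`, so a single subtraction is enough.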
@@ -40,5 +40,5 @@ relatively slow and could be sped up a lot by using static array on stack:
 {{#include ../code/snippets/src/scene/ray.rs:do_static_ray_cast}}
 ```
 
-`usage_example` shows how to use the `do_static_ray_cast` function - all you need to do is to specify maximum amount of
-intersections you're interested in as a generic parameter.
\ No newline at end of file
+`usage_example` shows how to use the `do_static_ray_cast` function - all you need to do is to specify the maximum number of
+intersections you're interested in as a generic parameter.
diff --git a/src/rendering/materials.md b/src/rendering/materials.md
index 90bb6236..5d325d08 100644
--- a/src/rendering/materials.md
+++ b/src/rendering/materials.md
@@ -4,8 +4,8 @@ Material defines a set of values for a shader. Materials usually contains textur
 textures. Each parameter can be changed in runtime giving you the ability to create animated materials.
 However, in practice, most materials are static, this means that once it's created, it won't be changed anymore.
 
-Please keep in mind that the actual "rules" of drawing an entity are stored in the shader, 
-**material is only a storage** for specific uses of the shader. 
+Please keep in mind that the actual "rules" of drawing an entity are stored in the shader,
+**material is only a storage** for specific uses of the shader.
 
 Multiple materials can share the same shader, for example standard shader covers 95% of most common use
 cases, and it is shared across multiple materials. The only difference are property values, for example you can draw
@@ -16,7 +16,7 @@ multiple objects with the same material efficiently.
 
 ## Performance
 
-It is very important re-use materials as much as possible, because the amount of materials used per frame
+It is very important to re-use materials as much as possible, because the number of materials used per frame
The more unique materials you have per frame, the more work the renderer and video driver need in order to render a frame and more time the frame will require for rendering, thus lowering your FPS. @@ -27,7 +27,7 @@ The engine offers a standard PBR material, PBR stands for "Physically-Based Rend of shading which is very close to materials in real world (to some extent of course). The standard material can cover 95% of use cases, and it is suitable for almost any kind of game, except maybe -some cartoon-ish or stylized games. +some cartoon-ish or stylized games. The standard material has quite a lot of properties that can be used to fully utilize the power of PBR rendering: @@ -37,26 +37,26 @@ mesh ([see below](#transparency)) - **diffuseTexture** - a 2D texture containing the unlit "basic" colors of your object, this is the most commonly used texture. For example, you can assign a brick wall texture to this property and your object will look like a brick wall. -- **normalTexture** - a 2D texture containing per-pixel normal vectors. +- **normalTexture** - a 2D texture containing per-pixel normal vectors. - **metallicTexture** - a 2D texture containing per-pixel metallic factor, where 0 - dielectric, 1 - metal. -In simple words it defines whether your object reflects (1.0) the environment or not (0.0). -- **roughnessTexture** - a 2D texture containing per-pixel roughness factor, where 0 - completely flat, 1 - +In simple words it defines whether your object reflects (1.0) the environment or not (0.0). +- **roughnessTexture** - a 2D texture containing per-pixel roughness factor, where 0 - completely flat, 1 - very rough. - **heightTexture** - a 2D texture containing per-pixel displacement value, it is used with parallax mapping to crate an effect of volume on a flat surface. - **emissionTexture** - a 2D texture containing per-pixel emission lighting. 
You could use this to create emissive -surfaces like small lamps on wall of sci-fi ship, or to create glowing eyes for your monsters that will scare +surfaces like small lamps on wall of sci-fi ship, or to create glowing eyes for your monsters that will scare the player. - **lightmapTexture** - a 2D texture containing per-pixel **static** lighting. It is used to apply precomputed light to your 3D models, and the most common use case is to lit a static object using a static light. Precomputed light is very cheap. The engine offers built-in lightmapper that can generate lightmaps for you. - **aoTexture** - a 2D texture containing per-pixel shading values, allows you to "bake" shadows in for your 3D object. -- **texCoordScale** - a 2D vector that allows you to scale texture coordinates used to sample the textures +- **texCoordScale** - a 2D vector that allows you to scale texture coordinates used to sample the textures mentioned above (except lightmaps, they're using separate texture coordinates) - **layerIndex** - a natural number that is used for decals masking, a decal will only be applied to your mesh -if and only if the decal has matching index. -- **emissionStrength** - a 3D vector that allows you to set the strength of emission per-channel (R, G, B) for +if and only if the decal has matching index. +- **emissionStrength** - a 3D vector that allows you to set the strength of emission per-channel (R, G, B) for your `emissionTexture` ## Transparency @@ -70,7 +70,7 @@ path on your mesh object. It could be done in this way: # core::pool::Handle, # scene::{mesh::RenderPath, node::Node, Scene}, # }; -# +# # fn set_forward_render_path(scene: &mut Scene, mesh_handle: Handle) { scene.graph[mesh_handle] .as_mesh_mut() @@ -85,7 +85,7 @@ if you need lighting, you will need to use custom shader for that! ## Material import When you're loading a 3D model in the engine, the engine tries to convert the materials stored inside to standard -material. 
In most cases there is no way to create 100% matching material on the fly, instead the engine tries +material. In most cases there is no way to create 100% matching material on the fly, instead the engine tries to do its best to make sure the material will be imported as closely as possible to the original one. Various 3D modelling tools use different material system, but all of them allow you to export your 3D model in one of the commonly used formats (such as FBX). @@ -93,8 +93,8 @@ used formats (such as FBX). ### Blender When using Blender, make sure you are using **Principled BSDF** material, it is the closest material that can be converted -to engine's standard material at almost 100% fidelity. +to engine's standard material at almost 100% fidelity. ### 3Ds max -It highly depends on the version of the 3Ds max, but in general the default material should work fine. \ No newline at end of file +It highly depends on the version of the 3Ds max, but in general the default material should work fine. diff --git a/src/rendering/shaders.md b/src/rendering/shaders.md index ed0d3f84..6d07c212 100644 --- a/src/rendering/shaders.md +++ b/src/rendering/shaders.md @@ -2,19 +2,19 @@ Shader is a set of programs that run directly on graphics adapter. Each program from the set is called _sub-shader_. Sub-shaders linked with render pass, each render pass defines "where" to draw an object. -"where" means that you can set up your own render pass and the renderer will use the sub-shader with +"where" means that you can set up your own render pass and the renderer will use the sub-shader with your render pass. For the ease of use there are a number of [predefined render passes](#predefined-render-passes). -Shaders have properties of various types that can be used together with materials to draw an object. +Shaders have properties of various types that can be used together with materials to draw an object. ## Shaders language -The engine uses GLSL shading language for every sub-shader. 
There are numerous GLSL guides over the +The engine uses GLSL shading language for every sub-shader. There are numerous GLSL guides over the internet, so there is no need to "re-post" the well documented info again. There are very few differences: -1) No need to define a version of the shader. Every shader source will be pre-processed, and it will +1) No need to define a version of the shader. Every shader source will be pre-processed, and it will get correct version automatically. Preprocessing is needed because the same shader could run on OpenGL and WebGL (OpenGL ES) which have some differences. 2) There is a "standard" library of useful methods which is automatically included in every shader source @@ -28,7 +28,7 @@ Shader has rigid structure that could be described in this code snippet: ```json ( - // A set of properties, there could be any amount of properties. + // A set of properties, there could be any number of properties. properties: [ ( // Each property must have a name. This name must match with respective @@ -128,7 +128,7 @@ This material instance can be used for rendering. For example, you can assign it Property is a named variable of some type. Properties are directly tied with the uniforms in the sub-shaders, for each you can have a property called `time`, and then you can define `uniform float time;` in your sub-shader -and the engine will pass a property value to that uniform for you before drawing an object. Properties placed in +and the engine will pass a property value to that uniform for you before drawing an object. Properties placed in a "global namespace", which means that every sub-shader has "access" to the properties. ## Built-in properties @@ -171,25 +171,25 @@ This list will be extended in future releases. 
 
 ## Predefined render passes
 
-Predefined render passes helps you to create your own shader without a need to create your own render pass 
+Predefined render passes help you to create your own shader without a need to create your own render pass
 and to quickly start writing your shaders.
 
-- **GBuffer** - A pass that fills a set with render target sized textures with various data about each rendered 
-object. These textures then are used for physically-based lighting. Use this pass when you want the standard 
+- **GBuffer** - A pass that fills a set of render-target-sized textures with various data about each rendered
+object. These textures are then used for physically-based lighting. Use this pass when you want the standard
 lighting to work with your objects.
-- **Forward** - A pass that draws an object directly in render target. This pass is very limiting, it does not 
+- **Forward** - A pass that draws an object directly into the render target. This pass is very limiting: it does not
 support lighting, shadows, etc. It should be only used to render translucent objects.
 - **SpotShadow** - A pass that emits depth values for an object, later this depth map will be used to render shadows.
 - **PointShadow** - A pass that emits distance from a fragment to a point light, later this depth map will be used
 to render shadows.
-- **DirectionalShadow** - A pass that emits depth values for an object, later this depth map will be used to render 
+- **DirectionalShadow** - A pass that emits depth values for an object, later this depth map will be used to render
 shadows for directional light sources using cascaded shadow mapping.
 
 ## Drawing parameters
 
 Drawing parameters defines which GPU functions to use and at which state. For example, to render transparent
 objects you need to enable blending with specific blending rules. Or you need to disable culling to draw objects
-from both sides. This is when draw parameters come in handy. 
+from both sides. This is when draw parameters come in handy.
 
 There are relatively large list of drawing parameters, and it could confuse a person who didn't get used to work
 with graphics. The following list should help you to use drawing parameters correctly.
diff --git a/src/resources/hot_reloading.md b/src/resources/hot_reloading.md
index b187d050..85042124 100644
--- a/src/resources/hot_reloading.md
+++ b/src/resources/hot_reloading.md
@@ -1,6 +1,6 @@
 # Asset Hot Reloading
 
-Fyrox supports asset hot reloading for most of the supported asset types. Hot reloading is a very useful feature that 
+Fyrox supports asset hot reloading for most of the supported asset types. Hot reloading is a very useful feature that
 allows you to reload assets from disk when they're changing. For example, you can change a texture, save it and the
 engine will automatically reload it and the changes will reflect in the game (and the editor). This section of the book
 explains how asset hot reloading works for specific asset types and what to expect from it.
@@ -8,7 +8,7 @@ explains how asset hot reloading works for specific asset types and what to expe
 
 ## Textures
 
 Content of textures will be automatically reloaded when their source files are changed. Textures loading is usually quite
-fast and even large amount of changed textures shouldn't cause significant lags.
+fast, and even a large number of changed textures shouldn't cause significant lags.
 
 ## Sound
 
@@ -17,23 +17,23 @@ when a buffer is reloaded, this happens because of a sudden change of amplitude
 could be quite slow for large sounds (such a music), since usually sound buffers are encoded with some algorithm and
 this data needs to be decoded when reloading.
 
-## Models 
+## Models
 
-Model resource (which is prefab also) supports hot reloading as well, but with some small limitations. 
+Model resource (which is also a prefab) supports hot reloading as well, but with some small limitations.
-If a node in FBX or GLTF model changes its name, then its instance in the running game won't receive the changes from 
-the source file. This happens, because the engine uses object name to search for the "ancestor" from which it then takes 
+If a node in an FBX or GLTF model changes its name, then its instance in the running game won't receive the changes from
+the source file. This happens because the engine uses the object name to search for the "ancestor" from which it then takes
 the data. If you swap names between two or more objects, their properties will be swapped in the game also. This issue
-does not exist if you're changing names in native engine prefabs. 
+does not exist if you're changing names in native engine prefabs.
 
 Hierarchy changes in a source file will be reflected in all instances, however it could not work correctly if you're changing
-hierarchy in FBX or GLTF model if there are duplicated names. This issue does not exist if you're changing names in native 
+hierarchy in an FBX or GLTF model if there are duplicated names. This issue does not exist if you're changing names in native
 engine prefabs.
 
 Objects deleted in models will be also deleted in the running game, which could result in crash if you're expecting
 the objects to be always alive.
 
-Any change in a leaf prefab in a chain of hierarchical prefabs will cause an update pass of its first ancestor. In other 
-words, if you have a level with a room prefab, and this room prefab has chair prefab instances in it then any change in the 
-chair prefab source file will be applied to the chair prefab itself, then its instances in the room prefab. See 
-[property inheritance](../scene/inheritance.md) chapter for more info.
\ No newline at end of file
+Any change in a leaf prefab in a chain of hierarchical prefabs will cause an update pass of its first ancestor. In other
+words, if you have a level with a room prefab, and this room prefab has chair prefab instances in it, then any change in the
+chair prefab source file will be applied to the chair prefab itself, and then to its instances in the room prefab. See
+the [property inheritance](../scene/inheritance.md) chapter for more info.
diff --git a/src/scene/decal_node.md b/src/scene/decal_node.md
index 22a97fb7..054849aa 100644
--- a/src/scene/decal_node.md
+++ b/src/scene/decal_node.md
@@ -23,7 +23,7 @@ supported.
 
 ## Rendering
 
 Currently, the engine supports only _deferred decals_, which means that decals modify the information stored in
-G-Buffer. This fact means that decals will be lit correctly with other geometry in the scene. However, if you 
+G-Buffer. This means that decals will be lit correctly with other geometry in the scene. However, if you
 have some objects in your scene that uses forward rendering path, your decals won't be applied to them.
 
 ## Bounds
 
@@ -33,19 +33,19 @@ everything that got into OOB will be covered. Exact bounds can be set by tweakin
 If you want your decal to be larger, set its scale to some large value. To position a decal - use local position,
 to rotate - local rotation.
 
-A decal defines a cube that projects a texture on every pixel of a scene that got into the cube. Exact cube size 
-is defined by decal's local scale. For example, if you have a decal with scale of (1.0, 2.0, 0.1) then the size of 
-the cube (in local coordinates) will be width = 1.0, height = 2.0 and depth = 0.1. The decal can be rotated as any 
+A decal defines a cube that projects a texture on every pixel of a scene that got into the cube. The exact cube size
+is defined by the decal's local scale. For example, if you have a decal with a scale of (1.0, 2.0, 0.1), then the size of
+the cube (in local coordinates) will be width = 1.0, height = 2.0 and depth = 0.1. The decal can be rotated as any
 other scene node. Its final size and orientation are defined by the chain of transformations of parent nodes.
 
 ## Layers
 
 There are situations when you want to prevent some geometry from being covered with a decal, to do that the engine
-offers a concept of layers. A decal will be applied to a geometry if and only if they have matching layer index. This 
-allows you to create environment damage decals, and they won't affect dynamic objects since they're located on 
+offers a concept of layers. A decal will be applied to a geometry if and only if they have a matching layer index. This
+allows you to create environment damage decals, and they won't affect dynamic objects since they're located on
 different layers.
 
 ## Performance
 
 Current implementation of decals is relatively cheap, this allows you to create many decals on scene. However, you
-should keep the amount of decals at a reasonable level.
\ No newline at end of file
+should keep the number of decals at a reasonable level.
diff --git a/src/scene/light_node.md b/src/scene/light_node.md
index 1114df4e..f13ee0c8 100644
--- a/src/scene/light_node.md
+++ b/src/scene/light_node.md
@@ -1,6 +1,6 @@
 # Light node
 
-The engine offers complex lighting system with various types of light sources. 
+The engine offers a complex lighting system with various types of light sources.
 
 ## Light types
 
@@ -10,7 +10,7 @@ There are three main types of light sources: directional, point, and spotlights.
 ### Directional light
 
 Directional light does not have a position, its rays are always parallel, and it has a particular direction in space.
 An example of directional light in real-life could be our Sun. Even if it is a point light, it is so far away from
-the Earth, so we can assume that its rays are always parallel. Directional light sources are suitable for outdoor 
+the Earth that we can assume that its rays are always parallel. Directional light sources are suitable for outdoor
 scenes.
 A directional light source could be created like this:
 
@@ -30,7 +30,7 @@ like this:
 
 ### Point light
 
 Point light is a light source that emits lights in all directions, it has a position, but does not have an orientation.
-An example of a point light source: light bulb. 
+An example of a point light source: a light bulb.
 
 ```rust,no_run
 {{#include ../code/snippets/src/scene/light.rs:create_point_light}}
 ```
@@ -38,7 +38,7 @@ An example of a point light source: light bulb.
 
 ### Spotlight
 
-Spotlight is a light source that emits lights in cone shape, it has a position and orientation. An example of 
+A spotlight is a light source that emits light in a cone shape; it has a position and orientation. An example of
 a spotlight source: flashlight.
 
 ```rust,no_run
 {{#include ../code/snippets/src/scene/light.rs:create_spot_light}}
 ```
@@ -51,7 +51,7 @@ a spotlight source: flashlight.
 
 Spot and point lights support light scattering effect. Imagine you're walking with a flashlight in a foggy weather,
 the fog will scatter the light from your flashlight making it, so you'll see the "light volume". Light scattering is
-**enabled by default**, so you don't have to do anything to enable it. However, in some cases you might want to disable 
+**enabled by default**, so you don't have to do anything to enable it. However, in some cases you might want to disable
 it, you can do this either while building a light source or change light scattering options on existing light source.
 Here is the small example of how to do that.
 
@@ -59,7 +59,7 @@
 {{#include ../code/snippets/src/scene/light.rs:disable_light_scatter}}
 ```
 
-You could also change the amount of scattering per each color channel, using this you could imitate the 
+You could also change the amount of scattering per color channel; using this you could imitate the
 [Rayleigh scattering](https://en.wikipedia.org/wiki/Rayleigh_scattering):
 
 ```rust,no_run
@@ -69,8 +69,8 @@ You could also change the amount of scattering per each color channel, using thi
 
 ## Shadows
 
 By default, light sources cast shadows. You can change this by using `set_cast_shadows` method of a light source. You
-should carefully manage shadows: shadows giving the most significant performance impact, you should keep the amount of
-light sources that can cast shadows at lowest possible amount to keep performance at good levels. You can also turn 
+should carefully manage shadows: shadows give the most significant performance impact, so you should keep the number of
+light sources that can cast shadows as low as possible to keep performance at good levels. You can also turn
 on/off shadows when you need:
 
 ```rust,no_run
@@ -80,11 +80,11 @@ on/off shadows when you need:
 
 Not every light should cast shadows, for example a small light that a player can see only in a distance can have
 shadows disabled. You should set the appropriate values depending on your scene, just remember: the fewer the shadows
 the better the performance. The most expensive shadows are from point lights, the less, from spotlights and directional
-lights. 
+lights.
 
 ## Performance
 
 Lights are not cheap, every light source has some performance impact. As a general rule, try to keep the amount of
 light sources at reasonable levels and especially try to avoid creating tons of light sources in a small area. Keep
 in mind that the less area the light needs to "cover", the higher the performance. This means that you can have
-tons of small light sources for free.
\ No newline at end of file
+tons of small light sources for free.
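The per-channel scattering tweak mentioned in the light-scattering hunk above follows Rayleigh's 1/λ⁴ law: shorter wavelengths scatter more. A self-contained sketch of how such per-channel factors could be derived (the wavelengths and the normalization are illustrative assumptions, not values used by fyrox):

```rust
/// Relative Rayleigh scattering factors for R, G and B, normalized so that the
/// most-scattered channel (blue) is 1.0. Scattering strength ~ 1 / wavelength^4.
fn rayleigh_factors() -> [f32; 3] {
    // Representative wavelengths in nanometers for red, green and blue.
    let wavelengths = [650.0_f32, 510.0, 440.0];
    let raw = wavelengths.map(|w| 1.0 / w.powi(4));
    // Blue has the shortest wavelength, hence the largest raw factor.
    let max = raw[2];
    [raw[0] / max, raw[1] / max, raw[2] / max]
}

fn main() {
    let [r, g, b] = rayleigh_factors();
    assert!((b - 1.0).abs() < 1e-6);
    // Shorter wavelengths scatter more: red < green < blue.
    assert!(r < g && g < b);
}
```

Feeding factors shaped like these into the per-channel scattering options is what produces the blue-ish haze the book associates with Rayleigh scattering.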
diff --git a/src/scene/particle_system_node.md b/src/scene/particle_system_node.md
index adfcfb3e..a077f0d9 100644
--- a/src/scene/particle_system_node.md
+++ b/src/scene/particle_system_node.md
@@ -1,8 +1,8 @@
-# Particle system 
+# Particle system
 
 Particle system is a scene node that is used to create complex visual effects (VFX). It operates on huge amount
-of particles at once allowing you to do complex simulation that involves large amount of particles. Typically,
-particle systems are used to create following visual effects: smoke, sparks, blood splatters, steam, etc. 
+of particles at once allowing you to do complex simulation that involves a large number of particles. Typically,
+particle systems are used to create the following visual effects: smoke, sparks, blood splatters, steam, etc.
 
 ![smoke](./particle_system_example.png)
 
@@ -11,7 +11,7 @@ particle systems are used to create following visual effects: smoke, sparks, blo
 Particle system uses _single_ texture for every particle in the system, only Red channel is used. Red channel
 interpreted as an alpha for all particles.
 
-Every particle is affected by `Acceleration` parameters of the particle system. It defines acceleration 
+Every particle is affected by the `Acceleration` parameter of the particle system. It defines the acceleration
 (in m/s2) that will affect velocities of every particle. It is used to simulate gravity.
 
 ### Particle
 
@@ -21,10 +21,10 @@ has the following properties:
 
 - `Position` - defines a position in _local_ coordinates of particle system (this means that if you rotate a particle
 system, all particles will be rotated too).
-- `Velocity` - defines a speed vector (in local coordinates) that will be used to modify local position of the particle 
+- `Velocity` - defines a speed vector (in local coordinates) that will be used to modify the local position of the particle
 each frame.
 - `Size` - size (in meters) of the square shape of the particle.
-- `Size Modifier` - a numeric value (in meters per second), that will be added to the Size at each frame, it is used +- `Size Modifier` - a numeric value (in meters per second), that will be added to the Size at each frame, it is used to modify size of the particles. - `Lifetime` - amount of time (in seconds) that the particle can be active for. - `Rotation` - angle (in radians) that defines rotation around particle-to-camera axis (clockwise). @@ -33,8 +33,8 @@ to modify size of the particles. ### Emitters -Particle system uses _emitters_ to define a set of zones where particles will be spawned, it also defines initial ranges of -parameters of particles. Particle system must have at least one emitter to generate particles. +Particle system uses _emitters_ to define a set of zones where particles will be spawned, it also defines initial ranges of +parameters of particles. Particle system must have at least one emitter to generate particles. Emitter can be one of the following types: @@ -46,19 +46,19 @@ Each emitter have fixed set of parameters that affects _initial_ values for ever - `Position` - emitter have its own _local_ position (position relative to parent particle system node), this helps you to create complex particle systems that may spawn particles from multiple zones in space at once. -- `Max Particles` - maximum amount of particles available for spawn. By default, it is `None`, which says that there is +- `Max Particles` - maximum number of particles available for spawn. By default, it is `None`, which says that there is no limit. - `Spawn Rate` - rate (in units per second) defines how fast the emitter will spawn particles. - `Lifetime Range` - numeric range (in seconds) for particle lifetime values. The lower the beginning of the range the less spawned particles will live, and vice versa. - `Size Range` - numeric range (in meters) for particle size. 
- `Size Modifier Range` - numeric range (in meters per second, m/s) for particle size modifier parameter.
-- `X/Y/Z Velocity Range` - a numeric range (in meters per second, m/s) for a respective velocity axis (X, Y, Z)
+- `X/Y/Z Velocity Range` - a numeric range (in meters per second, m/s) for a respective velocity axis (X, Y, Z)
that defines initial speed along the axis.
- `Rotation Range` - a numeric range (in radians) for initial rotation of a new particle.
- `Rotation Speed Range` - a numeric range (in radians per second, rad/s) for rotation speed of a new particle.
-**Important:** Every range (like Lifetime Range, Size Range, etc.) parameter generates _random_ value for respective
+**Important:** Every range parameter (like Lifetime Range, Size Range, etc.) generates a _random_ value for the respective
parameter of a particle. You can tweak the seed of current random number generator (`fyrox::core::thread_rng()`)
to ensure that generated values will be different each time.
@@ -69,9 +69,9 @@ There are multiple ways of creating a particle system, pick one that best suits
### Using the editor
The best way to create a particle system is to configure it in the editor, creating from code is possible too (see below),
-but way harder and may be not intuitive, because of the large amount of parameters. The editor allows you see the result
-and tweak it very fast. Create a particle system by `Create -> Particle System` and then you can start editing its
-properties. By default, new particle system has one Sphere particle emitter, you can add new emitters by clicking `+`
+but much harder and possibly unintuitive, because of the large number of parameters. The editor allows you to see the result
+and tweak it very quickly. Create a particle system by `Create -> Particle System` and then you can start editing its
+properties.
By default, a new particle system has one Sphere particle emitter; you can add new emitters by clicking the `+`
button at the right of `Emitters` property in the Inspector (or remove by clicking `-`). Here's a simple example:
![particle system](./particle_system.png)
@@ -88,7 +88,7 @@ You can also create particle systems from code (in case if you need some procedu
```
This code creates smoke effect with smooth dissolving (by using color-over-lifetime gradient). Please refer to
-[API docs](https://docs.rs/fyrox/latest/fyrox/scene/particle_system/index.html) for particle system for more information.
+[API docs](https://docs.rs/fyrox/latest/fyrox/scene/particle_system/index.html) for the particle system for more information.
### Using prefabs
@@ -97,27 +97,27 @@ particle system and then [instantiate](../resources/model.md#instantiation) it t
## Soft particles
-Fyrox used special technique, called soft particles, that smooths sharp transitions between particles and scene geometry:
+Fyrox uses a special technique, called soft particles, that smooths sharp transitions between particles and scene geometry:
![soft particles](./soft_particles.png)
-This technique especially useful for effects such as smoke, fog, etc. where you don't want to see the "edge" between
+This technique is especially useful for effects such as smoke, fog, etc. where you don't want to see the "edge" between
particles and scene geometry. You can tweak this effect using `Soft Boundary Sharpness Factor`, the larger the value
the more "sharp" the edge will be and vice versa.
## Restarting emission
-You can "rewind" particle systems in the "initial" state by calling `particle_system.clear_particles()` method, it
+You can "rewind" particle systems to their initial state by calling the `particle_system.clear_particles()` method; it
will remove all generated particles and emission will start over.
## Enabling or disabling particle systems
By default, every particle system is enabled.
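One common use of the enabled flag is a delayed effect: create the particle system disabled and enable it after a countdown. The sketch below is illustrative only; `DelayedEffect` is a hypothetical name, and only `set_enabled` is an actual engine method named in this chapter:

```rust
/// A minimal countdown for delaying a particle effect. `update` returns
/// `true` exactly once, when the delay has elapsed -- at that point you
/// would call `particle_system.set_enabled(true)`. This struct is an
/// illustration, not part of the engine API.
pub struct DelayedEffect {
    remaining: f32,
    triggered: bool,
}

impl DelayedEffect {
    pub fn new(delay_seconds: f32) -> Self {
        Self { remaining: delay_seconds, triggered: false }
    }

    /// Advances the timer by `dt` seconds; returns true on the frame the
    /// delay expires.
    pub fn update(&mut self, dt: f32) -> bool {
        if self.triggered {
            return false;
        }
        self.remaining -= dt;
        if self.remaining <= 0.0 {
            self.triggered = true;
            return true;
        }
        false
    }
}
```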
Sometimes there is a need to create a particle system, but not enable -it (for example for some delayed effect). You can achieve this by calling `particle_system.set_enabled(true/false)` +it (for example for some delayed effect). You can achieve this by calling `particle_system.set_enabled(true/false)` method. Disabled particle systems will still be drawn, but emission and animation will be stopped. To hide particle system completely, use `particle_system.set_visibility(false)` method. -## Performance +## Performance Particle systems using special renderer that optimized to draw millions of particles with very low overhead, however particles simulated on CPU side and may significantly impact overall performance when there are many particle systems @@ -126,4 +126,3 @@ with lots of particles in each. ## Limitations Particle systems does not interact with lighting, this means that particles will not be lit by light sources in the scene. - diff --git a/src/scene/terrain_node.md b/src/scene/terrain_node.md index 1fafb74b..9737f6e7 100644 --- a/src/scene/terrain_node.md +++ b/src/scene/terrain_node.md @@ -23,7 +23,7 @@ while Y is stored in a separate array which is then used to modify heights of ce Layer is a material + mask applied to terrain's mesh. Mask is a separate, greyscale texture that defines in which parts of the terrain the material should be visible or not. White pixels in the mask makes the material to be visible, black - -completely transparent, everything between helps you to create smooth transitions between layers. Here's a simple +completely transparent, everything between helps you to create smooth transitions between layers. Here's a simple example of multiple layers: ![terrain layers layout](./terrain_layers_layout.png) @@ -37,14 +37,14 @@ Each layer uses separate material, which can be edited from respective property ## Creating terrain in the editor -You can create a terrain node by clicking `Create -> Terrain`. 
It will create a terrain with fixed width, height,
+You can create a terrain node by clicking `Create -> Terrain`. It will create a terrain with fixed width, height,
and resolution (see [limitations](./terrain_node.md#limitations-and-known-issues)). Once the terrain is created, select
-it in the World Viewer and click on Hill icon on the toolbar. This will enable terrain editing, brush options panel
+it in the World Viewer and click the Hill icon on the toolbar. This will enable terrain editing, and the brush options panel
should also appear. See the picture below with all the steps:
![terrain editing](./terrain_editing.png)
-The green rectangle on the terrain under the cursor represents current brush. You can edit brush options in the
+The green rectangle on the terrain under the cursor represents the current brush. You can edit brush options in the
`Brush Options` window:
![brush options](./brush_options.png)
@@ -53,9 +53,9 @@ The green rectangle on the terrain under the cursor represents current brush. Yo
radius appears. When a rectangular brush is select, controls for its width and length appear. The size of the green
rectangle changes to reflect the size of the brush based on these controls.
- *Mode:* Select the terrain editing operation that the brush should perform.
  - *Raise or Lower:* Modifies the existing value by a fixed amount. When the amount is positive, the value is
    increased. When the amount is negative, the value is decreased. When the brush target is "Height Map", this can be
    to raise or lower the terrain.
When the `Shift` key is held at the start of a brush stroke, the amount of raising or lowering
is negated, so a raise operation becomes a lowering operation.
  - *Assign Value:* Replaces the existing value with a given value. For example, if you want to create a plateau with
land of a specific height, you can select this mode and type in the height you want as the brush value.
@@ -171,36 +171,36 @@ Here is an example of `BrushContext` in use:
```
As you can see there is quite a lot of code, ideally you should use editor all the times, because handling everything
-from code could be very tedious. The result of its execution (if all textures are set correctly) could be something
+from code could be very tedious. The result of its execution (if all textures are set correctly) could be something
like this (keep in mind that terrain will be random everytime you run the code):
![terrain from code](./terrain_random.png)
## Physics
-By default, terrains does not have respective physical body and shape, it should be added manually. Create a static
+By default, terrains do not have a physical body and shape; these should be added manually. Create a static
rigid body node with a collider with Heightmap shape ([learn more about colliders](../physics/collider.md)). Then attach
-the terrain to the rigid body. Keep in mind that terrain's origin differs from Heightmap rigid body, so you need to offset
-the terrain to match its physical representation. Enable physics visualization in editor settings to see physical shapes
-and move terrain. Now to move the terrain you should move the body, instead of the terrain (because of parent-child
+the terrain to the rigid body. Keep in mind that the terrain's origin differs from the Heightmap rigid body's origin, so you need to offset
+the terrain to match its physical representation. Enable physics visualization in editor settings to see physical shapes
+and move terrain.
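The Raise or Lower arithmetic described above can be illustrated on a plain array of heights. This is a sketch of the mechanics only; the function name is illustrative, and the real editor brush also handles 2-D masks, brush shapes, and falloff:

```rust
use std::ops::Range;

/// Applies a Raise or Lower stroke to a 1-D strip of heights. `amount` is
/// added to every sample under the brush; when `shift_held` is true the
/// amount is negated, turning a raise operation into a lowering one.
pub fn raise_or_lower(heights: &mut [f32], brush: Range<usize>, amount: f32, shift_held: bool) {
    let delta = if shift_held { -amount } else { amount };
    for h in &mut heights[brush] {
        *h += delta;
    }
}
```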
Now to move the terrain you should move the body instead of the terrain (because of parent-child
+[relations](../beginning/scene_and_scene_graph.md#local-and-global-coordinates)).
-## Performance 
+## Performance
-Terrain rendering complexity have linear dependency with the amount of layers terrain have. Each layer forces the engine
-to re-render terrain's geometry with different textures and mask. Typical amount of layers is from 4 to 8. For example,
+Terrain rendering complexity grows linearly with the number of layers the terrain has. Each layer forces the engine
+to re-render the terrain's geometry with different textures and mask. A typical number of layers is 4 to 8. For example,
a terrain could have the following layers: dirt, grass, rock, snow. This is a relatively lightweight scheme. In any
case, you should measure frame time to understand how each new layer affects performance in your case.
## Chunking
Terrain itself does not define any geometry or rendering data, instead it uses one or more chunks for that purpose. Each
-chunk could be considered as a "sub-terrain". You can "stack" any amount of chunks from any side of the terrain. To do
-that, you define a range of chunks along each axis. This is very useful if you need to extend your terrain in a particular
+chunk could be considered a "sub-terrain". You can "stack" any number of chunks from any side of the terrain. To do
+that, you define a range of chunks along each axis. This is very useful if you need to extend your terrain in a particular
direction. Imagine that you've created a terrain with just one chunk (`0..1` range on both axes), but suddenly you found
-that you need to extend the terrain to add some new game locations. In this case you can change the range of chunks at
-the desired axis. For instance, if you want to add a new location to the right from your single chunk, then you should
-change `width_chunks` range to `0..2` and leave `length_chunks` as is (`0..1`).
This way terrain will be extended, and
+that you need to extend the terrain to add some new game locations. In this case you can change the range of chunks on
+the desired axis. For instance, if you want to add a new location to the right from your single chunk, then you should
+change `width_chunks` range to `0..2` and leave `length_chunks` as is (`0..1`). This way the terrain will be extended, and
you can start shaping the new location.
## Level-of-detail
@@ -210,18 +210,18 @@ possible quality (defined by the resolution of height map and masks), while the
rendered with the lowest quality. This effectively balances GPU load and allows you to render huge terrains with low
overhead.
-The main parameter that affects LOD system is `block_size` (`Terrain::set_block_size`), which defines size of the patch
-that will be used for rendering. It is used to divide the size of the height map into a fixed set of blocks using
+The main parameter that affects the LOD system is `block_size` (`Terrain::set_block_size`), which defines the size of the patch
+that will be used for rendering. It is used to divide the size of the height map into a fixed set of blocks using
quad-tree algorithm.
-Current implementation uses modified version of CDLOD algorithm without patch morphing. Apparently it is not needed,
+The current implementation uses a modified version of the CDLOD algorithm without patch morphing. Apparently it is not needed,
since bilinear filtration in vertex shader prevents seams to occur.
-Current implementation makes it possible to render huge terrains (64x64 km) with 4096x4096 heightmap resolution in about a
+The current implementation makes it possible to render huge terrains (64x64 km) with 4096x4096 heightmap resolution in about a
millisecond on average low-to-middle-end GPU.
## Limitations and known issues
-There is no way to cut holes in the terrain yet, it makes impossible to create caves.
There is also no way to create
+There is no way to cut holes in the terrain yet, which makes it impossible to create caves. There is also no way to create
ledges, use separate meshes to imitate this. See [tracking issue](https://github.com/FyroxEngine/Fyrox/issues/351) for
-more info.
\ No newline at end of file
+more info.
diff --git a/src/sound/hrtf.md b/src/sound/hrtf.md
index 5a645796..71c62459 100644
--- a/src/sound/hrtf.md
+++ b/src/sound/hrtf.md
@@ -1,17 +1,17 @@
-# Head Related Transfer Function 
+# Head Related Transfer Function
Head Related Transfer Function (HRTF for short) is special audio processing technique that improves audio spatialization.
By default, sound spatialization is very simple - volume of each audio channel (left and right) changes accordingly to
orientation of the listener. While this simple and fast, it does not provide good audio spatialization - sometimes it is
hard to tell from which direction the actual sound is coming from. To solve this issue, we can use head-related transfer
function. Despite its scary, mathematical name, it is easy to understand what it's doing. Instead of uniformly
-changing volume of all frequencies of the signal (as the naive spatialization does), it changes them separately for
+changing volume of all frequencies of the signal (as the naive spatialization does), it changes them separately for
each channel. The exact "gains" of each frequency of each channel is depends on the contents of head-related transfer
function. This is done for each azimuth and elevation angles, which gives full picture of how audio signal from each
direction travels to each ear.
HRTF is usually recorded using a head model with ears with a microphone inside each ear. To capture head-related impulse
-response (time domain) at a fixed distance and angle pair (azimuth and elevation), a very short impulse of sound is produced.
+response (time domain) at a fixed distance and angle pair (azimuth and elevation), a very short impulse of sound is produced. Microphones inside each ear records the signal, and then HRIR (time domain) can be converted in HRTF (frequency domain). ## HRTF on practice @@ -30,7 +30,7 @@ Once it is loaded, all sounds in the scene will use the HRTF for rendering. The # graph::Graph, # sound::{self, HrirSphere, HrirSphereResource, HrirSphereResourceExt, HrtfRenderer, Renderer}, # }; -# +# fn use_hrtf(graph: &mut Graph) { let hrir_sphere = HrirSphereResource::from_hrir_sphere( HrirSphere::from_file("path/to/hrir.bin", sound::SAMPLE_RATE).unwrap(), "path/to/hrir.bin".into()); @@ -44,5 +44,5 @@ fn use_hrtf(graph: &mut Graph) { ## Performance HRTF is heavy. It is 5-6 times slower than the simple spatialization, so use it only on middle-end or high-end hardware. -HRTF performance is linearly dependent on the amount of sound sources: the more sound sources use HRTF, the worse performance -will be and vice versa. \ No newline at end of file +HRTF performance is linearly dependent on the number of sound sources: the more sound sources use HRTF, the worse performance +will be and vice versa. diff --git a/src/ui/basic_concepts/basic_concepts.md b/src/ui/basic_concepts/basic_concepts.md index a8f740e6..0f9c2181 100644 --- a/src/ui/basic_concepts/basic_concepts.md +++ b/src/ui/basic_concepts/basic_concepts.md @@ -4,12 +4,12 @@ This chapter should help you understand basic concepts lying in the foundation o ## Stateful -Stateful UI means that we can create and destroy widgets when we need to, it is the opposite approach of +Stateful UI means that we can create and destroy widgets when we need to, it is the opposite approach of immediate-mode or stateless UIs when you don't have long-lasting state for your widgets -(usually stateless UI hold its state only for one or two frames). +(usually stateless UI hold its state only for one or two frames). 
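At its core, applying a recorded HRIR to a signal, as described in the sound chapter above, is a convolution per ear. A naive time-domain sketch for illustration only; real HRTF renderers such as the one above work block-wise in the frequency domain for performance:

```rust
/// Naive time-domain convolution of a mono signal with a per-ear impulse
/// response (HRIR). Output length is signal.len() + hrir.len() - 1.
/// Illustrative only: production renderers use FFT-based convolution.
pub fn convolve(signal: &[f32], hrir: &[f32]) -> Vec<f32> {
    let mut out = vec![0.0; signal.len() + hrir.len() - 1];
    for (i, &s) in signal.iter().enumerate() {
        for (j, &h) in hrir.iter().enumerate() {
            out[i + j] += s * h;
        }
    }
    out
}
```

Running this once per ear with that ear's HRIR yields the left/right channels of the spatialized output.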
-Stateful UI is much more powerful and flexible, it allows you to have complex layout system without having to
-create hacks to create complex layout as you'd do in immediate-mode UIs. It is also much faster in terms of
+Stateful UI is much more powerful and flexible; it allows you to have a complex layout system without having to
+create hacks to build complex layouts as you'd do in immediate-mode UIs. It is also much faster in terms of
performance.
Stateful UI is a must for complex user interfaces that requires rich layout and high performance. I'm not telling
@@ -20,8 +20,8 @@ more info.
The UI system is designed to be used in a classic model-view-controller MVC approach. Model in this case is your game
state, view is the UI system, controller is your event handlers. In other words - the UI shows what happens in your game
-and does not store any game-related information. This is quite old, yet powerful mechanism that decouples UI code from
-game code very efficiently and allows you to change game code and UI code independently.
+and does not store any game-related information. This is an old, yet powerful mechanism that decouples UI code from
+game code very efficiently and allows you to change game code and UI code independently.
## Node-based architecture
@@ -33,8 +33,8 @@ we call button. Graphically it will look like this:
![Button](./button.svg)
-On the right side of the image we can see the generic button and on the left side, we can see its hierarchical
-structure. Such approach allows us to modify the look of the button as we wish, we can create a button with
+On the right side of the image we can see the generic button and on the left side, we can see its hierarchical
+structure. Such an approach allows us to modify the look of the button as we wish; we can create a button with
image background, or with any vector image, or even other widgets.
The foreground can be anything too, it can also
contain its own complex hierarchy, like a pair of an icon with a text and so on.
@@ -42,13 +42,13 @@ contain its own complex hierarchy, like a pair of an icon with a text and so on.
Every widget in the engine uses composition to build more complex widgets. All widgets (and respective builders) contains
`Widget` instance inside, it provides basic functionality the widget such as layout information, hierarchy, default
-foreground and background brushes (their usage depends on derived widget), render and layout transform and so on.
+foreground and background brushes (their usage depends on derived widget), render and layout transform and so on.
## Component Querying
-Many widgets provide component querying functionality - you can get an immutable reference to inner component by its type. It is
-used instead of type casting in many places. Component querying is much more flexible compared to direct type casting.
-For example, you may want to build a custom [Tree](../tree.md) widget, you want your CustomTree to inherit all the
+Many widgets provide component querying functionality - you can get an immutable reference to an inner component by its type. It is
+used instead of type casting in many places. Component querying is much more flexible compared to direct type casting.
+For example, you may want to build a custom [Tree](../tree.md) widget, and you want your CustomTree to inherit all the
functionality from the Tree, but add something new. The Tree widget can manage its children subtrees, but it needs to
somehow get required data from subtree. Direct type casting would fail in this case, because now you have something
like this:
@@ -62,7 +62,7 @@ struct CustomTree {
}
```
-On other hand, component querying will work fine, because you can query inner component (Tree in our case). Please note
+On the other hand, component querying will work fine, because you can query the inner component (Tree in our case).
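The component-querying idea above can be sketched with `std::any`. The trait and widget names below are illustrative, not the actual fyrox-ui API; the point is that a `CustomTree` can hand out its inner `Tree` even though a direct cast of the widget to `Tree` would fail:

```rust
use std::any::{Any, TypeId};

/// Sketch of a component-querying trait: ask a widget for an inner
/// component by type instead of downcasting the widget itself.
pub trait QueryComponent {
    fn query_component(&self, type_id: TypeId) -> Option<&dyn Any>;
}

pub struct Tree {
    pub expanded: bool,
}

pub struct CustomTree {
    pub tree: Tree,
}

impl QueryComponent for CustomTree {
    fn query_component(&self, type_id: TypeId) -> Option<&dyn Any> {
        if type_id == TypeId::of::<Tree>() {
            // Casting the widget to `Tree` would fail (its concrete type is
            // `CustomTree`), but querying the inner component succeeds.
            Some(&self.tree)
        } else {
            None
        }
    }
}
```

A caller recovers the concrete type with `downcast_ref`: `custom.query_component(TypeId::of::<Tree>()).and_then(|c| c.downcast_ref::<Tree>())`.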
Please note
that this has nothing similar with ECS and stuff, it is made to circumvent Rust's lack of inheritance.
## Message passing
@@ -73,8 +73,8 @@ widget to change its content to something new. This is done by sending a message
There is no classic callbacks to handle various types of messages, which may come from widgets. Instead, you should write
your own message dispatcher where you'll handle all messages. Why so? At first - decoupling, in this case business logic
-is decoupled from the UI. You just receive messages one-by-one and do specific logic. The next reason is that any
-callback would require context capturing which could be somewhat restrictive - since you need to share context with the
+is decoupled from the UI. You just receive messages one-by-one and do specific logic. The next reason is that any
+callback would require context capturing which could be somewhat restrictive - since you need to share context with the
UI, it would force you to wrap it in `Rc<RefCell<..>>`/`Arc<Mutex<..>>`.
Message dispatcher is very easy to write, all you need to do is to handle UI messages in `Plugin::on_ui_message` method:
@@ -88,13 +88,13 @@ from which a message was come from. Then you do any actions you want.
### Message routing strategies
-Message passing mechanism works in pair with various routing strategies that allows you to define how the message
+The message passing mechanism works in tandem with various routing strategies that allow you to define how a message
will "travel" across the tree of nodes.
1. Bubble - a message starts its way from a widget and goes up on hierarchy until it reaches root node of hierarchy.
Nodes that lies outside that path won't receive the message. This is the most important message routing strategy,
that is used for **every** node by default.
-2. Direct - a message passed directly to every node that are capable to handle it. There is actual routing in this
+2. Direct - a message is passed directly to every node that is capable of handling it.
There is no actual routing in this
case. Direct routing is used in rare cases when you need to catch a message outside its normal "bubble" route.
Bubble message routing is used to handle complex hierarchies of widgets with ease. Let's take a look at the button
@@ -104,7 +104,7 @@ level of hierarchy up - to the button widget itself. This way the button widget
## Layout
-The UI systems uses complex, yet powerful layout system that allows you to build complex user interfaces with
+The UI system uses a complex, yet powerful layout system that allows you to build complex user interfaces with
complex layout. Layout pass has two _recursive_ sub-passes:
1. Measurement - the sub-pass is used to fetch the desired size of each widget in hierarchy. Each widget in the hierarchy
@@ -113,7 +113,7 @@ of a widget root of some hierarchy you need to recursively fetch the desired siz
2. Arrangement - the sub-pass is used to set final position and size of each widget in hierarchy. It uses desired size
of every widget from the previous step to set the final size and relative position. This step is recursive.
-Such separation in two passes is required because we need to know desired size of each node in hierarchy before we can
+Such separation into two passes is required because we need to know the desired size of each node in hierarchy before we can
actually do an arrangement.
## Code-first and Editor-first approaches
The UI system supports both ways of making a UI:
1) Code-first approach is used when your user interface is procedural and its appearance is heavily depends on your game
logic. In this case you need to use various widget builder to create UIs.
2) Editor-first approach is used when you have relatively static (animations does not count) user interface,
-that almost does not change in time.
In this case you can use built-in WYSIWYG (what-you-see-is-what-you-get)
-editor. See [Editor](../editor/editor.md) chapter for more info.
+that almost does not change over time. In this case you can use the built-in WYSIWYG (what-you-see-is-what-you-get)
+editor. See the [Editor](../editor/editor.md) chapter for more info.
In case of code-first approach you should prefer so-called _fluent syntax_: this means that you can create your widget
in series of nested call of other widget builders. In code, it looks something like this:
@@ -137,7 +137,7 @@ This code snippet creates a button with an image and a text. Actually it creates
complex hierarchy. The topmost widget in hierarchy is the `Button` widget itself, it has two children widgets: background
image and a text. Background image is set explicitly by calling image widget builder with specific texture. The text
is created implicitly, the button builder creates `Text` widget for you and attaches it to
-the button. The structure of the button can contain _any_ amount of nodes, for example you can create a button
+the button. The structure of the button can contain _any_ number of nodes, for example you can create a button
that contains text with some icon. To do that, replace `.with_text("My Button")` with this:
```rust,no_run
@@ -153,8 +153,8 @@ the fluent syntax:
## Limitations
-UI system uses completely different kind of scenes - UI scenes, which are fully decoupled from game scenes. This means
-that you can't incorporate UI widgets in a game scene. As a consequence, you don't have an ability to attach scripts to
+The UI system uses a completely different kind of scene - UI scenes, which are fully decoupled from game scenes. This means
+that you can't incorporate UI widgets in a game scene. As a consequence, you don't have the ability to attach scripts to
widgets - their logic is strictly defined in their backing code. This limitation is intentional, and it is here only
for one reason - decoupling of UI code from game logic. Currently, there's only one right approach to make UIs -
-to create widgets in your game plugin and sync the state of the widgets with game entities manually.
\ No newline at end of file +to create widgets in your game plugin and sync the state of the widgets with game entities manually. diff --git a/src/ui/scroll_bar.md b/src/ui/scroll_bar.md index 98e1ba3f..4f9080ae 100644 --- a/src/ui/scroll_bar.md +++ b/src/ui/scroll_bar.md @@ -1,4 +1,4 @@ -# Scroll bar +# Scroll bar ![scroll bar](scroll_bar.gif) @@ -35,9 +35,9 @@ enum there. By default, scroll bar does not show its actual value, you can turn it on using `ScrollBarBuilder::show_value` method with `true` as the first argument. To change rounding of the value, use `ScrollBarBuilder::with_value_precision` -and provide the desired amount of decimal places there. +and provide the desired number of decimal places there. ## Step Scroll bar provides arrows to change the current value using a fixed step value. You can change it using -`ScrollBarBuilder::with_step` method. \ No newline at end of file +`ScrollBarBuilder::with_step` method. diff --git a/src/ui/scroll_panel.md b/src/ui/scroll_panel.md index b4fb579d..8e54c88c 100644 --- a/src/ui/scroll_panel.md +++ b/src/ui/scroll_panel.md @@ -1,9 +1,9 @@ -# Scroll panel +# Scroll panel Scroll panel widget does the same as [Scroll Viewer](scroll_viewer.md) widget, but it does not have any additional -widgets and does not have any graphics. It is a panel widget that provides basic scrolling functionality and -[Scroll Viewer](scroll_viewer.md) is built on top of it. Strictly speaking, scroll panel widget is used to arrange its -children widgets, so they can be offset by a certain amount of units from top-left corner. It is used to provide basic +widgets and does not have any graphics. It is a panel widget that provides basic scrolling functionality and +[Scroll Viewer](scroll_viewer.md) is built on top of it. Strictly speaking, scroll panel widget is used to arrange its +children widgets, so they can be offset by a certain number of units from top-left corner. It is used to provide basic scrolling functionality. 
## Examples @@ -26,4 +26,4 @@ Calculates the scroll values to bring a desired child into view, it can be used ```rust {{#include ../code/snippets/src/ui/scroll_panel.rs:bring_child_into_view}} -``` \ No newline at end of file +```
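The basic offset arithmetic a scroll panel performs can be sketched as clamping the scroll offset to the scrollable range. This is an illustration of the idea, not the fyrox-ui implementation:

```rust
/// Clamps a scroll offset so children are never scrolled past the content
/// bounds: the maximum offset is how far the content extends beyond the
/// viewport, and it is zero when everything already fits.
pub fn clamp_scroll(offset: f32, content_size: f32, viewport_size: f32) -> f32 {
    let max = (content_size - viewport_size).max(0.0);
    offset.clamp(0.0, max)
}
```

The same clamp applies per axis (vertical and horizontal) when both scrolling directions are enabled.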