diff --git a/chimney/src/main/scala-3/io/scalaland/chimney/dsl/TransformerInto.scala b/chimney/src/main/scala-3/io/scalaland/chimney/dsl/TransformerInto.scala
index 57a11390f..347961b5b 100644
--- a/chimney/src/main/scala-3/io/scalaland/chimney/dsl/TransformerInto.scala
+++ b/chimney/src/main/scala-3/io/scalaland/chimney/dsl/TransformerInto.scala
@@ -211,6 +211,13 @@ final class TransformerInto[From, To, Overrides <: TransformerOverrides, Flags <
   )(using IsFunction.Of[Ctor, To]): TransformerInto[From, To, ? <: TransformerOverrides, Flags] =
     ${ TransformerIntoMacros.withConstructorImpl('this, 'f) }
 
+  /** Require that all fields of the source object, except the fields mentioned in `selectorFrom`, are used in the
+    * transformation, and fail compilation otherwise.
+    *
+    * @param selectorFrom
+    *   fields exempted from this requirement - they do not have to be used in the transformation
+    * @return [[io.scalaland.chimney.dsl.TransformerInto]] with the requirement applied
+    */
   transparent inline def requireSourceFieldsUsedExcept(
       inline selectorFrom: From => Any*
   ): TransformerInto[From, To, ? <: TransformerOverrides, Flags] =
diff --git a/docs/docs/supported-transformations.md b/docs/docs/supported-transformations.md
index 74318232a..5eadfe681 100644
--- a/docs/docs/supported-transformations.md
+++ b/docs/docs/supported-transformations.md
@@ -19,7 +19,8 @@ from the types, you need to provide it with a hint, but nothing more.
 While looking at code examples you're going to see these 2 terms: **Total Transformers** and **Partial Transformers**.
 
-Chimney's job is to generate the code that will convert the value of one type (often called a **source** type, or `From`)
+Chimney's job is to generate the code that will convert the value of one type (often called a **source** type, or
+`From`)
 into another type (often called a **target** type, or `To`). When Chimney has enough information to generate the
 transformation, most of the time it could do it for **every** value of the source type. In Chimney, we called such
 transformations Total (because they are virtually **total functions**). One way in which Chimney allows you to use such
@@ -138,13 +139,13 @@ function was not defined, "empty value" when something was expected) and even th
 As you can see `partial.Result` contains `Iterable` as a structure for holding its errors. Thanks to that:
 
- - errors are aggregated - after partial transformation you have access to all failures that happened so that you
-   could fix them at once, rather than rerunning the transformation several times
+- errors are aggregated - after partial transformation you have access to all failures that happened so that you
+  could fix them at once, rather than rerunning the transformation several times
   - you can turn this off with a runtime flag, just call `.transformIntoPartial[To](failFast = true)`
- - errors are lazy - if their computation is expensive and they aren't used, you are not paying for it
- - there are some build-in conversions from `partial.Result` (e.g. to `Option` or `Either`), and there are
-   [conversions to Cats types](cookbook.md#cats-integration), but you encouraged to convert them yourself
-   to whatever data format you use to represents errors
+- errors are lazy - if their computation is expensive and they aren't used, you are not paying for it
+- there are some built-in conversions from `partial.Result` (e.g. to `Option` or `Either`), and there are
+  [conversions to Cats types](cookbook.md#cats-integration), but you are encouraged to convert them yourself
+  to whatever data format you use to represent errors
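
For illustration, here is how that aggregation looks in practice - a minimal sketch with made-up case classes, assuming
only the `partial.Result` accessors `asOption` and `asErrorPathMessageStrings`:

```scala
//> using dep io.scalaland::chimney::{{ chimney_version() }}
import io.scalaland.chimney.dsl._

case class Form(name: Option[String], age: Option[Int])
case class Registered(name: String, age: Int)

// unwrapping Options is a fallible step, so it needs a Partial Transformer;
// with both fields empty, both failures are collected instead of only the first one
val result = Form(None, None).transformIntoPartial[Registered]

result.asOption                  // None - the transformation failed
result.asErrorPathMessageStrings // two entries: name -> empty value, age -> empty value

// the failFast flag stops at the first error instead of aggregating them
Form(None, None).transformIntoPartial[Registered](failFast = true).asErrorPathMessageStrings
// one entry: name -> empty value
```

!!! 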
tip @@ -228,7 +229,7 @@ Other times you might need to convert `PartialFunction` into total function with However, the most common case would be where you would have to use one of utilities provided in `partial.Result`: !!! example - + ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} //> using dep com.lihaoyi::pprint::{{ libraries.pprint }} @@ -262,7 +263,7 @@ However, the most common case would be where you would have to use one of utilit If you are converting from: `Option`s, `Either[String, A]` or `Try` you can use an extension method: !!! example - + ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} //> using dep com.lihaoyi::pprint::{{ libraries.pprint }} @@ -479,7 +480,8 @@ be resolved recursively: ``` As we see, for infallible transformations there is very little difference in behavior between Total and Partial -Transformers. For "products" the difference shows up when transformation for any field/constructor fails. One such fallible +Transformers. For "products" the difference shows up when transformation for any field/constructor fails. One such +fallible transformation, available only in partial transformers, is unwrapping `Option` fields. !!! example @@ -564,9 +566,9 @@ side effects - you need to enable the `.enableMethodAccessors` flag: Flag `.enableMethodAccessors` will allow macros to consider methods that are: - - nullary (take 0 value arguments) - - have no type parameters - - cannot be considered Bean getters +- nullary (take 0 value arguments) +- have no type parameters +- cannot be considered Bean getters If the flag was enabled in the implicit config it can be disabled with `.disableMethodAccessors`. @@ -679,7 +681,7 @@ If the flag was enabled in the implicit config it can be disabled with `.enableI ### Reading from Bean getters If we want to read `def getFieldName(): A` as if it was `val fieldName: A` - which would allow reading from Java Beans -(or Plain Old Java Objects) - you need to enable a flag: +(or Plain Old Java Objects) - you need to enable a flag: !!! example @@ -721,11 +723,11 @@ If we want to read `def getFieldName(): A` as if it was `val fieldName: A` - whi Flag `.enableBeanGetters` will allow macros to consider methods which are: - - nullary (take 0 value arguments) - - have no type parameters - - have names starting with `get` - for comparison `get` will be dropped and the first remaining letter lowercased or - - have names starting with `is` and returning `Boolean` - for comparison `is` will be dropped and the first remaining - letter lowercased +- nullary (take 0 value arguments) +- have no type parameters +- have names starting with `get` - for comparison `get` will be dropped and the first remaining letter lowercased or +- have names starting with `is` and returning `Boolean` - for comparison `is` will be dropped and the first remaining + letter lowercased which would otherwise be ignored when analyzing possible sources of values. @@ -759,11 +761,47 @@ If the flag was enabled in the implicit config it can be disabled with `.disable // Consult https://chimney.readthedocs.io for usage examples. ``` +### Require source fields to be used + +If you want to enforce that every field of the source type is used in the transformation, you can enable the +`.requireSourceFieldsUsedExcept` setting. This setting also allows you to specify a certain subset of fields to be +exempt from this requirement. + +!!! 
example + + ```scala + //> using dep io.scalaland::chimney::{{ chimney_version() }} + import io.scalaland.chimney.dsl._ + + case class Source(a: String, b: Int, c: String) + case class Target(a: String) + + Source("value", 512, "anotherValue") + .into[Target] + .requireSourceFieldsUsedExcept() + .transform + // Chimney can't derive transformation from Source to Target + // + // Target + // field(s) b, c of Source are required to be used in the transformation but are not used! + // + // Consult https://chimney.readthedocs.io for usage examples. + + pprint.pprintln( + Source("value", 512, "anotherValue") + .into[Target] + .requireSourceFieldsUsedExcept(_.b, _.c) + .transform + ) + // expected output: + // Target(a = "value") + ``` + ### Writing to Bean setters If we want to write to `def setFieldName(fieldName: A): Unit` as if it was `fieldName: A` argument of a constructor - which would allow creating from Java Beans (or Plain Old Java Objects) - you need to enable the `.enableBeanSetters` -flag: +flag: !!! example @@ -819,10 +857,10 @@ flag: Flag `.enableBeanSetters` will allow macros to write to methods which are: - - unary (take 1 value argument) - - have no type parameters - - have names starting with `set` - for comparison `set` will be dropped and the first remaining letter lowercased - - returning `Unit` (this condition can be turned off) +- unary (take 1 value argument) +- have no type parameters +- have names starting with `set` - for comparison `set` will be dropped and the first remaining letter lowercased +- returning `Unit` (this condition can be turned off) _besides_ calling constructor (so you can pass values to _both_ the constructor and setters at once). Without the flag macro will fail compilation to avoid creating potentially uninitialized objects. @@ -937,9 +975,9 @@ If the flag was enabled in the implicit config it can be disabled with `.disable This flag can be combined with [`.enableBeanSetters`](#writing-to-bean-setters), so that: - - setters will attempt to be matched with fields from source - - setters could be overridden manually using `.withField*` methods - - those setters which would have no matching fields nor overrides would just be ignored +- setters will attempt to be matched with fields from source +- setters could be overridden manually using `.withField*` methods +- those setters which would have no matching fields nor overrides would just be ignored making this setting sort of a setters' counterpart to a default value in a constructor. @@ -1177,7 +1215,7 @@ yet have this concept). // Right(value = Target(value = SomeObject)) ``` -On Scala 3, parameterless `case` can be used as well: +On Scala 3, parameterless `case` can be used as well: !!! example @@ -1221,7 +1259,7 @@ On Scala 3, parameterless `case` can be used as well: Only `case object`s and parameterless `case`s are supported this way - other `object`s, or singletons defined for `value.type` are not supported at the moment. - + !!! 
notice `None.type` is explicitly excluded from this support as it might accidentally fill the value that should not be @@ -1261,13 +1299,13 @@ to default values with the `.enableDefaultValues` flag: A default value is used as a fallback, meaning: - - it has to be defined (and enabled with a flag) - - it will not be used if you provided value manually with one of the methods below - then the value provision always - succeeds - - it will not be used if a source field (`val`) or a method (enabled with one of the flags above) with a matching name - could be found - if a source value type can be converted into a target argument/setter type then the value provision - succeeds, but if Chimney fails to convert the value then the whole derivation fails rather than falls back to - the default value +- it has to be defined (and enabled with a flag) +- it will not be used if you provided value manually with one of the methods below - then the value provision always + succeeds +- it will not be used if a source field (`val`) or a method (enabled with one of the flags above) with a matching name + could be found - if a source value type can be converted into a target argument/setter type then the value provision + succeeds, but if Chimney fails to convert the value then the whole derivation fails rather than falls back to + the default value If the flag was enabled in the implicit config it can be disabled with `.disableDefaultValues`. @@ -1328,7 +1366,7 @@ default values of one particular type: Sometimes we transform value into a type that would use `Option`'s `None` to handle some default behavior and `Some` as the user's overrides. This type might not have a default value (e.g. `value: Option[A] = None`) in its constructor, but we would find it useful to fall back on `None` in such cases. It is not enabled out of the box, for -similar reasons to default values support, but we can enable it with the `.enableOptionDefaultsToNone` flag: +similar reasons to default values support, but we can enable it with the `.enableOptionDefaultsToNone` flag: !!! 
example @@ -1369,15 +1407,16 @@ similar reasons to default values support, but we can enable it with the `.enabl The `None` value is used as a fallback, meaning: - - it has to be enabled with a flag - - it will not be used if you provided value manually with one of the `.with*` methods - then the value provision - always succeeds - - it will not be used if a source field (`val`) or a method (enabled with one of the flags above) with a matching name - could be found - if a source value type can be converted into a target argument/setter type then the value provision - succeeds, but if Chimney fails to convert the value then the whole derivation fails rather than falls back to - the `None` value - - it will not be used if a default value is present and [the support for default values has been enabled](#allowing-fallback-to-the-constructors-default-values) - (the fallback to `None` has a lower priority than the fallback to a default value) +- it has to be enabled with a flag +- it will not be used if you provided value manually with one of the `.with*` methods - then the value provision + always succeeds +- it will not be used if a source field (`val`) or a method (enabled with one of the flags above) with a matching name + could be found - if a source value type can be converted into a target argument/setter type then the value provision + succeeds, but if Chimney fails to convert the value then the whole derivation fails rather than falls back to + the `None` value +- it will not be used if a default value is present + and [the support for default values has been enabled](#allowing-fallback-to-the-constructors-default-values) + (the fallback to `None` has a lower priority than the fallback to a default value) !!! example @@ -1484,21 +1523,21 @@ it with another field. Since the usual cause of such cases is a _rename_, we can The requirements to use a rename are as follows: - - you have to pass `_.fieldName` directly, it cannot be done with a reference to the function - - you can only use `val`/nullary method/Bean getter as a source field name - - you have to have a `val`/nullary method/Bean getter with a name matching constructor's argument (or Bean setter if - setters are enabled) to point which argument you are targeting - - the field rename can be _nested_, you can pass `_.foo.bar.baz` there, and on the constructor's arguments side - additionally you can use: - - `.matching[Subtype]` to select just one subtype of ADT e.g `_.adt.matching[Subtype].subtypeField` (do not use for - matching on `Option` or `Either`! Use dedicated matchers described below) - - `.matchingSome` to select values inside `Option` e.g. `_.option.matchingSome.field` - - `.matchingLeft` and `.matchingRight` to select values inside `Either` e.g. `_.either.matchingLeft.field` or - `_.either.matchingRight.field` - - `.everyItem` to select items inside collection or array e.g. `_.list.everyItem.field`, `_.array.everyItem.field` - - `.everyMapKey` and `.everyMapValue` to select keys/values inside maps e.g. 
`_.map.everyMapKey.field`, - `_.map.everyMapValue.field` - +- you have to pass `_.fieldName` directly, it cannot be done with a reference to the function +- you can only use `val`/nullary method/Bean getter as a source field name +- you have to have a `val`/nullary method/Bean getter with a name matching constructor's argument (or Bean setter if + setters are enabled) to point which argument you are targeting +- the field rename can be _nested_, you can pass `_.foo.bar.baz` there, and on the constructor's arguments side + additionally you can use: + - `.matching[Subtype]` to select just one subtype of ADT e.g `_.adt.matching[Subtype].subtypeField` (do not use for + matching on `Option` or `Either`! Use dedicated matchers described below) + - `.matchingSome` to select values inside `Option` e.g. `_.option.matchingSome.field` + - `.matchingLeft` and `.matchingRight` to select values inside `Either` e.g. `_.either.matchingLeft.field` or + `_.either.matchingRight.field` + - `.everyItem` to select items inside collection or array e.g. `_.list.everyItem.field`, `_.array.everyItem.field` + - `.everyMapKey` and `.everyMapValue` to select keys/values inside maps e.g. `_.map.everyMapKey.field`, + `_.map.everyMapValue.field` + The last 2 conditions are always met when working with `case class`es with no `private val`s in constructor, and classes with all arguments declared as public `val`s, and Java Beans where each setter has a corresponding getter defined. @@ -1584,10 +1623,10 @@ We are also able to rename fields in nested structure: ### Wiring the constructor's parameter to a provided value -Another way of handling the missing source field - or overriding an existing one - is providing the value for +Another way of handling the missing source field - or overriding an existing one - is providing the value for the constructor's argument/setter yourself. The successful value can be provided using `.withFieldConst`: -!!! example +!!! example ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} @@ -1622,13 +1661,13 @@ the constructor's argument/setter yourself. The successful value can be provided `.withFieldConst` can be used to provide/override only _successful_ values. What if we want to provide a failure, e.g.: - - a `String` with an error message - - an `Exception` - - or a notion of the empty value? +- a `String` with an error message +- an `Exception` +- or a notion of the empty value? These cases can be handled only with `PartialTransformer` using `.withFieldConstPartial`: -!!! example +!!! example ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} @@ -1697,18 +1736,18 @@ As you can see, the transformed value will automatically preserve the field name The requirements to use a value provision are as follows: - - you have to pass `_.fieldName` directly, it cannot be done with a reference to the function - - you have to have a `val`/nullary method/Bean getter with a name matching constructor's argument (or Bean setter if - setters are enabled) - - the path can be _nested_, you can pass `_.foo.bar.baz` there, and additionally you can use: - - `.matching[Subtype]` to select just one subtype of ADT e.g `_.adt.matching[Subtype].subtypeField` (do not use for - matching on `Option` or `Either`! Use dedicated matchers described below) - - `.matchingSome` to select values inside `Option` e.g. `_.option.matchingSome.field` - - `.matchingLeft` and `.matchingRight` to select values inside `Either` e.g. 
`_.either.matchingLeft.field` or - `_.either.matchingRight.field` - - `.everyItem` to select items inside collection or array e.g. `_.list.everyItem.field`, `_.array.everyItem.field` - - `.everyMapKey` and `.everyMapValue` to select keys/values inside maps e.g. `_.map.everyMapKey.field`, - `_.map.everyMapValue.field` +- you have to pass `_.fieldName` directly, it cannot be done with a reference to the function +- you have to have a `val`/nullary method/Bean getter with a name matching constructor's argument (or Bean setter if + setters are enabled) +- the path can be _nested_, you can pass `_.foo.bar.baz` there, and additionally you can use: + - `.matching[Subtype]` to select just one subtype of ADT e.g `_.adt.matching[Subtype].subtypeField` (do not use for + matching on `Option` or `Either`! Use dedicated matchers described below) + - `.matchingSome` to select values inside `Option` e.g. `_.option.matchingSome.field` + - `.matchingLeft` and `.matchingRight` to select values inside `Either` e.g. `_.either.matchingLeft.field` or + `_.either.matchingRight.field` + - `.everyItem` to select items inside collection or array e.g. `_.list.everyItem.field`, `_.array.everyItem.field` + - `.everyMapKey` and `.everyMapValue` to select keys/values inside maps e.g. `_.map.everyMapKey.field`, + `_.map.everyMapValue.field` The second condition is always met when working with `case class`es with no `private val`s in constructor, and classes with all arguments declared as public `val`s, and Java Beans where each setter has a corresponding getter defined. @@ -1762,7 +1801,7 @@ with all arguments declared as public `val`s, and Java Beans where each setter h We are also able to provide values in nested structure: -!!! example +!!! example ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} @@ -1799,11 +1838,12 @@ We are also able to provide values in nested structure: ### Wiring the constructor's parameter to the computed value -Yet another way of handling the missing source field - or overriding an existing one - is computing the value for -the constructor's argument/setter out from a whole transformed value. The always-succeeding transformation can be provided +Yet another way of handling the missing source field - or overriding an existing one - is computing the value for +the constructor's argument/setter out from a whole transformed value. The always-succeeding transformation can be +provided using `.withFieldComputed`: -!!! example +!!! example ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} @@ -1847,13 +1887,13 @@ using `.withFieldComputed`: `.withFieldComputed` can be used to compute only _successful_ values. What if we want to provide a failure, e.g.: - - a `String` with an error message - - an `Exception` - - or a notion of the empty value? +- a `String` with an error message +- an `Exception` +- or a notion of the empty value? These cases can be handled only with `PartialTransformer` using `.withFieldComputedPartial`: -!!! example +!!! 
example ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} @@ -1929,18 +1969,18 @@ As you can see, the transformed value will automatically preserve the field name The requirements to use a value computation are as follows: - - you have to pass `_.fieldName` directly, it cannot be done with a reference to the function - - you have to have a `val`/nullary method/Bean getter with a name matching constructor's argument (or Bean setter if - setters are enabled) - - the path can be _nested_, you can pass `_.foo.bar.baz` there, and additionally you can use: - - `.matching[Subtype]` to select just one subtype of ADT e.g `_.adt.matching[Subtype].subtypeField` (do not use for - matching on `Option` or `Either`! Use dedicated matchers described below) - - `.matchingSome` to select values inside `Option` e.g. `_.option.matchingSome.field` - - `.matchingLeft` and `.matchingRight` to select values inside `Either` e.g. `_.either.matchingLeft.field` or - `_.either.matchingRight.field` - - `.everyItem` to select items inside collection or array e.g. `_.list.everyItem.field`, `_.array.everyItem.field` - - `.everyMapKey` and `.everyMapValue` to select keys/values inside maps e.g. `_.map.everyMapKey.field`, - `_.map.everyMapValue.field` +- you have to pass `_.fieldName` directly, it cannot be done with a reference to the function +- you have to have a `val`/nullary method/Bean getter with a name matching constructor's argument (or Bean setter if + setters are enabled) +- the path can be _nested_, you can pass `_.foo.bar.baz` there, and additionally you can use: + - `.matching[Subtype]` to select just one subtype of ADT e.g `_.adt.matching[Subtype].subtypeField` (do not use for + matching on `Option` or `Either`! Use dedicated matchers described below) + - `.matchingSome` to select values inside `Option` e.g. `_.option.matchingSome.field` + - `.matchingLeft` and `.matchingRight` to select values inside `Either` e.g. `_.either.matchingLeft.field` or + `_.either.matchingRight.field` + - `.everyItem` to select items inside collection or array e.g. `_.list.everyItem.field`, `_.array.everyItem.field` + - `.everyMapKey` and `.everyMapValue` to select keys/values inside maps e.g. `_.map.everyMapKey.field`, + `_.map.everyMapValue.field` The second condition is always met when working with `case class`es with no `private val`s in constructor, and classes with all arguments declared as public `val`s, and Java Beans where each setter has a corresponding getter defined. @@ -1994,7 +2034,7 @@ with all arguments declared as public `val`s, and Java Beans where each setter h We are also able to compute values in nested structure: -!!! example +!!! example ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} @@ -2035,7 +2075,7 @@ We are also able to compute values in nested structure: Be default names are matched in a Java-Bean-aware way - `fieldName` would be considered a match for another `fieldName` but also for `isFieldName`, `getFieldName` and `setFieldName`. This allows the macro to read both normal `val`s and Bean getters and write into constructor arguments and Bean setters. (Whether such getters/setters would we admitted -for matching is controlled by dedicated flags: [`.enableBeanGetters`](#reading-from-bean-getters) and +for matching is controlled by dedicated flags: [`.enableBeanGetters`](#reading-from-bean-getters) and [`.enableBeanSetters`](#writing-to-bean-setters)). 
The field name matching predicate can be overridden with a flag: @@ -2111,7 +2151,8 @@ The field name matching predicate can be overridden with a flag: } ``` -For details about `TransformedNamesComparison` look at [their dedicated section](#defining-custom-name-matching-predicate). +For details about `TransformedNamesComparison` look +at [their dedicated section](#defining-custom-name-matching-predicate). !!! warning @@ -2343,7 +2384,7 @@ a flag: // Right(value = UserName(value = "user name")) } ``` - + If the flag was enabled in the implicit config it can be disabled with `.disbleNonAnyValWrappers`. !!! example @@ -2472,7 +2513,7 @@ It works also with Scala 3's `enum`: // Buzz } ``` - + !!! example `enum` into `sealed trait` @@ -2505,7 +2546,7 @@ It works also with Scala 3's `enum`: // Buzz } ``` - + !!! example `enum` into `enum` @@ -3221,7 +3262,8 @@ The subtype name matching predicate can be overridden with a flag: } ``` -For details about `TransformedNamesComparison` look at [their dedicated section](#defining-custom-name-matching-predicate). +For details about `TransformedNamesComparison` look +at [their dedicated section](#defining-custom-name-matching-predicate). !!! warning @@ -3433,7 +3475,7 @@ automatically only with `PartialTransformer`: If you need to provide support for your optional types, please, read about [custom optional types](cookbook.md#custom-optional-types). - + ### Controlling automatic `Option` unwrapping Automatic unwrapping of `Option`s by `PartialTransformer`s allows for seamless decoding of many PTO types into domain @@ -3588,7 +3630,7 @@ Every `Array`/every collection provided with `scala.collection.compat.Factory` c collection's transformation. The requirement for a collection's transformation is that both source's and target's conditions are met and that -the types stored within these collections can also be converted. +the types stored within these collections can also be converted. !!! example @@ -3709,7 +3751,7 @@ If the type is `abstract` and used as a value, but contains enough information t knows how to apply it, the transformation can still be derived: !!! example - + If Chimney knows that type can be safely upcasted, the upcasting is available to it: ```scala @@ -3775,7 +3817,7 @@ knows how to apply it, the transformation can still be derived: Finally, you can always provide a custom `Transformer` from/to a type containing a type parameter, as an `implicit`: !!! example - + ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} import io.scalaland.chimney.dsl._ @@ -3805,7 +3847,7 @@ If the target is one of supported singleton types, we can provide the transforma Scala 2.13 and 3 allow using [literal-based singleton types](https://docs.scala-lang.org/sips/42.type.html): !!! example - + ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} //> using dep com.lihaoyi::pprint::{{ libraries.pprint }} @@ -3919,7 +3961,7 @@ When the target is a `case class`, the transformation can always be provided: // Right(value = SomeObject) ``` -On Scala 3, parameterless `case` can be used as well: +On Scala 3, parameterless `case` can be used as well: !!! example @@ -3967,16 +4009,16 @@ On Scala 3, parameterless `case` can be used as well: If you cannot use a public primary constructor to create the target type, is NOT a Scala collection, `Option`, `AnyVal`, ... 
but is e.g.: - - a type using a smart constructor - - a type which has multiple constructors and you need to point which one you want to use - - abstract type defined next to an abstract method that will instantiate it - - non-`sealed` `trait` where you want to pick one particular implementation for your transformation +- a type using a smart constructor +- a type which has multiple constructors and you need to point which one you want to use +- abstract type defined next to an abstract method that will instantiate it +- non-`sealed` `trait` where you want to pick one particular implementation for your transformation AND you do know a way of constructing this type using a method - or handwritten lambda - you can point to that method. -Then Chimney will try to match the source type's getters against the method's parameters by their names: +Then Chimney will try to match the source type's getters against the method's parameters by their names: !!! example - + ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} //> using dep com.lihaoyi::pprint::{{ libraries.pprint }} @@ -4018,7 +4060,7 @@ If your type only has smart a constructor which e.g. validates the input and mig constructor for `PartialTransformer`: !!! example - + ```scala //> using dep io.scalaland::chimney::{{ chimney_version() }} //> using dep com.lihaoyi::pprint::{{ libraries.pprint }} @@ -4072,10 +4114,10 @@ constructor for `PartialTransformer`: You can use this to automatically match the source's getters e.g. against smart constructor's arguments - these types would almost always have methods which the user could recognize as constructor's but which might be difficult -to be automatically recognized as such: +to be automatically recognized as such: !!! example - + Due to the nature of `opaque type`s this example needs to have opaque types defined in a different `.scala` file than where they are being used: @@ -4193,7 +4235,7 @@ to be automatically recognized as such: } ``` -!!! tip +!!! tip `opaque type`s usually have only one constructor argument, and usually it is easier to not transform them that way, but rather call their constructor directly. If `opaque type`s are nested in the transformed structure, it might be @@ -4404,12 +4446,12 @@ when Chimney match subtypes by name, you can tell it how to convert them using i // A(int = "10") // B ``` - + However, usually it is easier to provide it via [an override](#handling-a-specific-sealed-subtype-with-a-computed-value) instead. !!! warning - + There also exist a special fallback rule for `sealed`/`enum` allowing to use a source's subtype to the whole target type: @@ -4455,8 +4497,8 @@ instead. When you use Partial Transformers Chimney will try to: - - summon the user-provided implicit - either `PartialTransformer` or `Transformer` - - derive `PartialTransformer` +- summon the user-provided implicit - either `PartialTransformer` or `Transformer` +- derive `PartialTransformer` Under normal circumstances infallible transformation would be defined as `Transformer` and `PartialTransformer`s would still be able to use it, so there is hardly ever the need for 2 instances for the same types. 
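As a minimal sketch of that reuse (the types and the instance below are made up for illustration):

```scala
//> using dep io.scalaland::chimney::{{ chimney_version() }}
import io.scalaland.chimney.Transformer
import io.scalaland.chimney.dsl._

case class Source(value: String)
case class Target(value: String)

// a single, total instance...
implicit val sourceToTarget: Transformer[Source, Target] =
  new Transformer[Source, Target] {
    def transform(src: Source): Target = Target(src.value.trim)
  }

// ...serves the total API...
Source(" a ").transformInto[Target]                 // Target("a")

// ...and is summoned by the partial API as well - no separate PartialTransformer needed
Source(" a ").transformIntoPartial[Target].asOption // Some(Target("a"))
```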
@@ -4548,20 +4590,20 @@ The Chimney does not decide and in the presence of 2 implicits it will fail and When Chimney derives transformation it is a recursive process: - - for each `class` into `case class`/POJO it will attempt recursion to find a mapping from the source field to - the target constructor's argument/setter - - for `sealed`/`enum`s it will attempt to convert each `case` pair recursively - - for `AnyVal`s it will attempt to resolve mappings between inner values - - for `Option`s and `Either`s and collections it will attempt to resolve mappings of the element types +- for each `class` into `case class`/POJO it will attempt recursion to find a mapping from the source field to + the target constructor's argument/setter +- for `sealed`/`enum`s it will attempt to convert each `case` pair recursively +- for `AnyVal`s it will attempt to resolve mappings between inner values +- for `Option`s and `Either`s and collections it will attempt to resolve mappings of the element types etc. The conditions for terminating the recursion are: - - a failure to find a supported conversion (for every supported case at least one condition wasn't met, and users - haven't provided their own via implicits) - - the finding of user-provided `implicit` which handles the transformation between resolved types - - proving that the source type is a subtype of the target type, so we can just upcast it. +- a failure to find a supported conversion (for every supported case at least one condition wasn't met, and users + haven't provided their own via implicits) +- the finding of user-provided `implicit` which handles the transformation between resolved types +- proving that the source type is a subtype of the target type, so we can just upcast it. ### Recursive data types @@ -4634,18 +4676,18 @@ If we need to customize it, we can use `.define.buildTransformer`: Arguments taken by both `.enableCustomFieldNameComparison` and `.enableCustomSubtypeNameComparison` are values of type `TransformedNamesComparison`. Out of the box, Chimney provides: - - `TransformedNamesComparison.StrictEquality` - 2 names are considered equal only if they are identical `String`s. - This is the default matching strategy for subtype names comparison - - `TransformedNamesComparison.BeanAware` - 2 names are considered equal if they are identical `String`s OR if they are - identical after you convert them from Java Bean naming convention: +- `TransformedNamesComparison.StrictEquality` - 2 names are considered equal only if they are identical `String`s. + This is the default matching strategy for subtype names comparison +- `TransformedNamesComparison.BeanAware` - 2 names are considered equal if they are identical `String`s OR if they are + identical after you convert them from Java Bean naming convention: - if a name starts with `is`/`get`/`set` prefix (e.g. `isField`, `getField`, `setField`) then - strip this name from the prefix (obtaining e.g. `Field`) and - lower case the first letter (obtaining e.g. `field`) - - - `TransformedNamesComparison.CaseInsensitiveEquality` - 2 names are considered equal if `equalsIgnoreCase` returns + +- `TransformedNamesComparison.CaseInsensitiveEquality` - 2 names are considered equal if `equalsIgnoreCase` returns `true` -However, these 3 do not exhaust all possible comparisons and you might need to provide one yourself. +However, these 3 do not exhaust all possible comparisons and you might need to provide one yourself. !!! 
warning

     The challenge is that the function you'd like to provide has to be called within a macro, in a way that the
     macro will be able to access it. Normally, there is no way to inject custom logic into an existing macro, but
     Chimney has a specific solution for this:
 
- - you need to define your `TransformedNamesComparison` as `object` - objects do not need constructor arguments, so
-   they can be instantiated easily
- - your have to define this `object` as top-level definition or within another object - object defined within a `class`,
-   a `trait` or locally, does need some logic for instantiation
- - you have to define your `object` in a module/subproject that is compiled _before_ the module where you need to use
-   it, so that the bytecode would already be accessible on the classpath.
+- you need to define your `TransformedNamesComparison` as an `object` - objects do not need constructor arguments, so
+  they can be instantiated easily
+- you have to define this `object` as a top-level definition or within another object - an object defined within
+  a `class`, a `trait` or locally, would need some logic for instantiation
+- you have to define your `object` in a module/subproject that is compiled _before_ the module where you need to use
+  it, so that the bytecode would already be accessible on the classpath.
 
 !!! example