This section also applies to `Agent`, `Option`, `Trigger`, `Tool`, `Parameter`, and `When`, but the implementation for those is a little simpler.
When implementing a step, there are two pieces of work you have to do:
- Create a class that extends `com.code42.jenkins.pipelinekt.core.step.Step`.
  - You can and should combine `com.code42.jenkins.pipelinekt.core.step.SingletonStep` or `com.code42.jenkins.pipelinekt.core.step.NestedStep` with one of `com.code42.jenkins.pipelinekt.core.step.DeclarativeStep` or `com.code42.jenkins.pipelinekt.core.step.ScriptedStep`. For example:

    ```kotlin
    data class MyNewStep(...) : SingletonStep, DeclarativeStep
    // or
    data class MyStepWithANestedStep(..., override val steps: Step) : NestedStep, DeclarativeStep
    ```
  - `SingletonStep` is for steps that don't contain other steps, like `sh` or `stash`.
  - `NestedStep` is for steps that contain other steps, like `withCredentials`, `withEnv`, or `node`.
  - `DeclarativeStep` and `ScriptedStep` determine whether a step runs inside the declarative or the scripted context.
    - Some steps do not support the declarative Jenkinsfile context; see the Jenkins documentation.
    - `ScriptedStep` can be used to implement custom code blocks as steps.
  - If extending `DeclarativeStep`, implement `toGroovy`; if extending `ScriptedStep`, implement `scriptedGroovy`.
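To make the split concrete, here is a minimal, self-contained sketch of the pattern. The interfaces and the `Echo`/`Retry` steps below are hypothetical stand-ins; the real interfaces in `com.code42.jenkins.pipelinekt.core.step` have richer signatures.

```kotlin
// Simplified stand-ins for the pipelinekt step interfaces.
interface Step
interface SingletonStep : Step
interface DeclarativeStep : Step {
    // Assumption: renders the step as groovy text.
    fun toGroovy(): String
}

// A hypothetical leaf step: contains no other steps.
data class Echo(val message: String) : SingletonStep, DeclarativeStep {
    override fun toGroovy(): String = "echo '$message'"
}

// A hypothetical nested step that wraps other steps and indents them.
data class Retry(val count: Int, val steps: List<DeclarativeStep>) : DeclarativeStep {
    override fun toGroovy(): String =
        "retry($count) {\n" +
            steps.joinToString("\n") { "    " + it.toGroovy() } +
            "\n}"
}
```

The data-class-plus-serializer shape is the key idea: each step is an immutable value that knows how to print itself as groovy.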
- If adding to pipelinekt, add the class to the `internal` module; otherwise, just put it in your project somewhere.
  - These classes are not meant to be exposed to end users. End users should exclusively use the `dsl` module, and the `dsl` module will not expose classes from `internal` to users.
- Implement a DSL method.
  - Create the dsl methods. A simple example:

    ```kotlin
    fun DslContext<Step>.myNewStep(...) {
        add(MyNewStep(...))
    }

    fun DslContext<Step>.myStepWithANestedStep(..., steps: DslContext<Step>.() -> Unit) {
        add(MyStepWithANestedStep(..., DslContext.into(steps).toStep()))
    }
    ```
  - This exposes a new step that can be used anywhere you can use a step, for example:

    ```kotlin
    stage("my stage") {
        steps {
            myNewStep(args...)
            myStepWithANestedStep(args...) {
                sh("env")
            }
        }
    }
    ```
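Under the hood, these dsl methods just accumulate step objects into a context. Here is a minimal sketch of that builder pattern; this `DslContext`, its `into`/`collected` methods, and the `ShStep` type are simplified stand-ins for illustration, not the pipelinekt API:

```kotlin
// A minimal accumulator: dsl methods call add() on the receiver.
class DslContext<T> {
    private val items = mutableListOf<T>()
    fun add(item: T) { items.add(item) }
    fun collected(): List<T> = items.toList()

    companion object {
        // Runs a builder block against a fresh context and returns it.
        fun <T> into(block: DslContext<T>.() -> Unit): DslContext<T> =
            DslContext<T>().apply(block)
    }
}

// A hypothetical step type and its dsl method, following the pattern above.
data class ShStep(val script: String)

fun DslContext<ShStep>.sh(script: String) = add(ShStep(script))
```

Usage looks like `DslContext.into<ShStep> { sh("env"); sh("ls -la") }`, which collects the two steps in order; nesting works the same way, with the inner block run against its own context.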
See the existing examples:

- The class `com.code42.jenkins.pipelinekt.internal.step.Sh` is a good example of a simple step: it is a data class that can serialize itself to groovy.
- `com.code42.jenkins.pipelinekt.internal.step.WithCredentials` is a simple example of a nested step.
The library also allows you to implement complex steps that take in configuration. Good examples of this are the gradle build dsl and the docker dsl. Let's take a look at part of the gradle build dsl:
```kotlin
data class GradleBuildDsl(
    val gradleCredentials: UsernamePassword? = null,
    val gradleUserProperty: String = "gradle.wrapperUser",
    val gradlePasswordProperty: String = "gradle.wrapperPassword"
) {
    ...
    fun DslContext<Step>.gradleCommand(command: String, additionalBuildArgs: Var.Literal.Str) =
        withEnv(
            mapOf(
                "GRADLE_USER_HOME" to "${"WORKSPACE".environmentVar()}/.gradle-home-tmp",
                "JENKINS_NODE_COOKIE" to "dontKillMe"
            )
        ) {
            artifactoryAuthenticated {
                sh(("./gradlew --stacktrace --build-cache " +
                    (gradleCredentials?.let { "-D$gradleUserProperty=\\\"\\\${${it.usernameVariable.value}}\\\" -D$gradlePasswordProperty=\\\"\\\${${it.passwordVariable.value}}\\\" " } ?: "") +
                    "$additionalBuildArgs $command").strDouble())
            }
        }
    ...
```
`GradleBuildDsl` defines a public extension method, which we can't call with just the dsl. However, we can call it inside our dsl script if we wrap the script in a call to `gradleBuildDsl.run { ... }`:
```kotlin
val gradle = GradleBuildDsl()

/**
 * Pipeline
 */
fun PipelineDsl.gradleBuildPipeline() =
    gradle.run {
        pipeline {
            stages {
                stage("Build") {
                    steps {
                        gradleCommand("build $gradleArgs")
                    }
                }
                stage("Publish") {
                    steps {
                        gradleCommand("publish $gradleArgs")
                    }
                }
            }
        }
    }
```
By wrapping the pipeline in `gradle.run`, the `gradleCommand` method is exposed on `DslContext`, which means it now behaves like a custom step.

This gives us the ability to inject configuration into the dsl; for gradle, the configuration is a credentialsId and username/password properties. This prevents users from having to repeat the same rather verbose `sh` calls all over our pipelines, and lets us build pipelines where we can easily swap out credentials.
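The receiver-scoping trick behind `gradle.run { ... }` can be shown with a self-contained sketch; `Credentials`, `BuildDsl`, and `renderPipeline` below are hypothetical stand-ins, not the pipelinekt API:

```kotlin
// Hypothetical configuration object, standing in for UsernamePassword etc.
data class Credentials(val id: String)

data class BuildDsl(val credentials: Credentials) {
    // A member extension function: only callable where a BuildDsl is an
    // implicit receiver, i.e. inside a buildDsl.run { ... } block.
    fun StringBuilder.buildCommand(task: String) {
        appendLine("./gradlew $task # using credentials ${credentials.id}")
    }
}

fun renderPipeline(dsl: BuildDsl): String = dsl.run {
    // Inside run { }, `this` is the BuildDsl instance, so buildCommand
    // resolves against its configuration.
    buildString {
        buildCommand("build")
        buildCommand("publish")
    }
}
```

Swapping credentials is then just a matter of constructing a different `BuildDsl` instance and wrapping the same pipeline code in its `run { }` block.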