monitor: add support for log monitors (#83)
obs-gh-konstantintikhonov authored Dec 12, 2023
1 parent acb0033 commit bdbc37a
Showing 16 changed files with 505 additions and 32 deletions.
8 changes: 8 additions & 0 deletions client/internal/meta/operation/monitor.graphql
@@ -58,6 +58,14 @@ fragment Monitor on Monitor {
descriptionField
primaryKey
}
... on MonitorRuleLog {
compareFunction
compareValues
lookbackTime
expressionSummary
logStageId
sourceLogDatasetId
}
}

notificationSpec {
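The `MonitorRuleLog` selection above exposes the log-rule fields on the shared `Monitor` fragment. Purely as an orientation aid, here is a hedged sketch of how those fields could be flattened into the Terraform `rule.log` block documented later in this commit; the `monitorRuleLog` struct and `flattenRuleLog` helper are hypothetical stand-ins, not the genqlient-generated type or the provider's actual flatten code.

```go
// Hypothetical sketch: field names mirror the GraphQL fragment above; the
// real generated type lives in client/meta/genqlient.generated.go.
type monitorRuleLog struct {
	CompareFunction    string
	CompareValues      []float64
	LookbackTime       string
	ExpressionSummary  *string
	LogStageId         *string
	SourceLogDatasetId *string
}

// flattenRuleLog converts the rule into the map shape used by the Terraform
// "log" block (see docs/resources/monitor.md in this commit).
func flattenRuleLog(r monitorRuleLog) map[string]interface{} {
	m := map[string]interface{}{
		"compare_function": r.CompareFunction,
		"compare_values":   r.CompareValues,
		"lookback_time":    r.LookbackTime,
	}
	if r.ExpressionSummary != nil {
		m["expression_summary"] = *r.ExpressionSummary
	}
	if r.LogStageId != nil {
		m["log_stage_id"] = *r.LogStageId
	}
	if r.SourceLogDatasetId != nil {
		m["source_log_dataset"] = *r.SourceLogDatasetId
	}
	return m
}
```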
62 changes: 59 additions & 3 deletions client/meta/genqlient.generated.go

Some generated files are not rendered by default.

2 changes: 2 additions & 0 deletions docs/data-sources/dataset.md
@@ -53,4 +53,6 @@ results of this stage.
the stage pipeline. It must refer to a label contained in `inputs`, or a
previous stage `alias`. The stage input can be omitted if `inputs`
contains a single element.
- `output_stage` (Boolean) A boolean flag used to specify the output stage. Should be used only for
a stage preceding the last stage. The last stage is an output stage by default.
- `pipeline` (String) An OPAL snippet defining a transformation on the selected input.
2 changes: 2 additions & 0 deletions docs/data-sources/monitor.md
@@ -149,4 +149,6 @@ results of this stage.
the stage pipeline. It must refer to a label contained in `inputs`, or a
previous stage `alias`. The stage input can be omitted if `inputs`
contains a single element.
- `output_stage` (Boolean) A boolean flag used to specify the output stage. Should be used only for
a stage preceding the last stage. The last stage is an output stage by default.
- `pipeline` (String) An OPAL snippet defining a transformation on the selected input.
2 changes: 2 additions & 0 deletions docs/data-sources/query.md
@@ -40,6 +40,8 @@ Optional:

- `alias` (String)
- `input` (String)
- `output_stage` (Boolean) A boolean flag used to specify the output stage. Should be used only for
a stage preceding the last stage. The last stage is an output stage by default.
- `pipeline` (String)


2 changes: 2 additions & 0 deletions docs/resources/dataset.md
@@ -76,6 +76,8 @@ results of this stage.
the stage pipeline. It must refer to a label contained in `inputs`, or a
previous stage `alias`. The stage input can be omitted if `inputs`
contains a single element.
- `output_stage` (Boolean) A boolean flag used to specify the output stage. Should be used only for
a stage preceding the last stage. The last stage is an output stage by default.
- `pipeline` (String) An OPAL snippet defining a transformation on the selected input.
## Import
Import is supported using the following syntax:
19 changes: 19 additions & 0 deletions docs/resources/monitor.md
@@ -50,6 +50,7 @@ Optional:
- `count` (Block List, Max: 1) (see [below for nested schema](#nestedblock--rule--count))
- `facet` (Block List, Max: 1) (see [below for nested schema](#nestedblock--rule--facet))
- `group_by_group` (Block List) (see [below for nested schema](#nestedblock--rule--group_by_group))
- `log` (Block List, Max: 1) (see [below for nested schema](#nestedblock--rule--log))
- `promote` (Block List, Max: 1) (see [below for nested schema](#nestedblock--rule--promote))
- `source_column` (String)
- `threshold` (Block List, Max: 1) (see [below for nested schema](#nestedblock--rule--threshold))
@@ -109,6 +110,22 @@ Optional:
- `group_name` (String)


<a id="nestedblock--rule--log"></a>
### Nested Schema for `rule.log`

Required:

- `compare_function` (String)
- `lookback_time` (String)

Optional:

- `compare_values` (List of Number)
- `expression_summary` (String) Short summary or comment describing how the data for the monitor is queried.
- `log_stage_id` (String) ID of the stage used to generate logs for preview. This is usually a stage before aggregation.
- `source_log_dataset` (String) ID of the dataset that contains logs for preview.
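
As a point of reference, the block above could be declared with the Terraform Plugin SDK roughly as sketched below. This is a hypothetical reconstruction from the documented attributes — the function name `logRuleSchema`, the use of `schema.TypeFloat` for `compare_values`, and the SDK v2 import are assumptions; the provider's actual monitor resource is not part of the rendered diff.

```go
package observe

import "github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"

// logRuleSchema is a hypothetical sketch matching the documented rule.log
// attributes; the real definition may differ.
func logRuleSchema() *schema.Schema {
	return &schema.Schema{
		Type:     schema.TypeList,
		Optional: true,
		MaxItems: 1,
		Elem: &schema.Resource{
			Schema: map[string]*schema.Schema{
				"compare_function":   {Type: schema.TypeString, Required: true},
				"lookback_time":      {Type: schema.TypeString, Required: true},
				"compare_values":     {Type: schema.TypeList, Optional: true, Elem: &schema.Schema{Type: schema.TypeFloat}},
				"expression_summary": {Type: schema.TypeString, Optional: true},
				"log_stage_id":       {Type: schema.TypeString, Optional: true},
				"source_log_dataset": {Type: schema.TypeString, Optional: true},
			},
		},
	}
}
```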


<a id="nestedblock--rule--promote"></a>
### Nested Schema for `rule.promote`

@@ -148,6 +165,8 @@ results of this stage.
the stage pipeline. It must refer to a label contained in `inputs`, or a
previous stage `alias`. The stage input can be omitted if `inputs`
contains a single element.
- `output_stage` (Boolean) A boolean flag used to specify the output stage. Should be used only for
a stage preceding the last stage. The last stage is an output stage by default.
- `pipeline` (String) An OPAL snippet defining a transformation on the selected input.


5 changes: 5 additions & 0 deletions observe/data_source_dataset.go
@@ -105,6 +105,11 @@ func dataSourceDataset() *schema.Resource {
Computed: true,
Description: descriptions.Get("transform", "schema", "stage", "pipeline"),
},
"output_stage": {
Type: schema.TypeBool,
Computed: true,
Description: descriptions.Get("transform", "schema", "stage", "output_stage"),
},
},
},
},
5 changes: 5 additions & 0 deletions observe/data_source_monitor.go
@@ -103,6 +103,11 @@ func dataSourceMonitor() *schema.Resource {
Computed: true,
Description: descriptions.Get("transform", "schema", "stage", "pipeline"),
},
"output_stage": {
Type: schema.TypeBool,
Computed: true,
Description: descriptions.Get("transform", "schema", "stage", "output_stage"),
},
},
},
},
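Both data sources above now surface the computed `output_stage` flag alongside `pipeline`. As an illustration only, a stage list carrying that flag might be converted into the shape the schema expects as sketched below; the helper name is hypothetical, and it leans on the `Stage` type introduced in `observe/data_source_query.go` (next file).

```go
// Hypothetical sketch: turn flattened stages (see the Stage type in
// data_source_query.go below) into the list backing the computed "stage"
// attribute, including the new output_stage flag.
func stagesToResourceData(stages []*Stage) []interface{} {
	out := make([]interface{}, 0, len(stages))
	for _, s := range stages {
		m := map[string]interface{}{
			"pipeline":     s.Pipeline,
			"output_stage": s.OutputStage,
		}
		if s.Alias != nil {
			m["alias"] = *s.Alias
		}
		if s.Input != nil {
			m["input"] = *s.Input
		}
		out = append(out, m)
	}
	return out
}
```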
50 changes: 43 additions & 7 deletions observe/data_source_query.go
@@ -13,6 +13,7 @@ import (
gql "github.com/observeinc/terraform-provider-observe/client/meta"
"github.com/observeinc/terraform-provider-observe/client/meta/types"
"github.com/observeinc/terraform-provider-observe/client/oid"
"github.com/observeinc/terraform-provider-observe/observe/descriptions"
)

func dataSourceQuery() *schema.Resource {
@@ -69,6 +70,12 @@ func dataSourceQuery() *schema.Resource {
Type: schema.TypeString,
Optional: true,
},
"output_stage": {
Type: schema.TypeBool,
Default: false,
Optional: true,
Description: descriptions.Get("transform", "schema", "stage", "output_stage"),
},
},
},
},
@@ -122,8 +129,9 @@ func dataSourceQuery() *schema.Resource {
}

type Query struct {
Inputs map[string]*Input `json:"inputs"`
Stages []*Stage `json:"stages"`
Inputs map[string]*Input `json:"inputs"`
Stages []*Stage `json:"stages"`
StageIds []string `json:"stage_ids"`
}

// Stage applies a pipeline to an input
@@ -132,16 +140,27 @@ type Query struct {
// Internally, the alias does not map to the stageID - it is the input name we
use when referring to this stage
type Stage struct {
Alias *string `json:"alias,omitempty"`
Input *string `json:"input,omitempty"`
Pipeline string `json:"pipeline"`
Alias *string `json:"alias,omitempty"`
Input *string `json:"input,omitempty"`
Pipeline string `json:"pipeline"`
OutputStage bool `json:"outputStage"`
}

// Input references an existing data source
type Input struct {
Dataset *string ` json:"dataset,omitempty"`
}

func getOutputStagesCount(stages []Stage) int {
c := 0
for _, s := range stages {
if s.OutputStage {
c += 1
}
}
return c
}

func newQuery(data *schema.ResourceData) (*gql.MultiStageQueryInput, diag.Diagnostics) {
inputIds := make(map[string]string)
for k, v := range data.Get("inputs").(map[string]interface{}) {
@@ -166,9 +185,18 @@ func newQuery(data *schema.ResourceData) (*gql.MultiStageQueryInput, diag.Diagno
if v, ok := data.GetOk(fmt.Sprintf("stage.%d.pipeline", i)); ok {
stage.Pipeline = v.(string)
}

if v, ok := data.GetOk(fmt.Sprintf("stage.%d.output_stage", i)); ok {
stage.OutputStage = v.(bool)
}
stages = append(stages, stage)
}

outputStagesCount := getOutputStagesCount(stages)
if outputStagesCount > 1 {
return nil, diag.FromErr(errMoreThanOneOutputStages)
}

var sortedNames []string
inputs := make(map[string]*gql.InputDefinitionInput, len(inputIds))
for name, input := range inputIds {
@@ -243,7 +271,10 @@ func newQuery(data *schema.ResourceData) (*gql.MultiStageQueryInput, diag.Diagno

// stage is done, append to transform
query.Stages = append(query.Stages, stageInput)
query.OutputStage = stageInputId

if outputStagesCount == 0 || stage.OutputStage {
query.OutputStage = stageInputId
}

// prepare for next iteration of loop
// this stage will become defaultInput for the next
@@ -432,7 +463,7 @@ func dataSourceQueryRead(ctx context.Context, data *schema.ResourceData, meta in
// return diags
// }

func flattenQuery(gqlStages []*gql.StageQuery) (*Query, error) {
func flattenQuery(gqlStages []*gql.StageQuery, outputStage string) (*Query, error) {
query := &Query{Inputs: make(map[string]*Input)}

// first reconstruct all inputs
@@ -475,7 +506,12 @@ func flattenQuery(gqlStages []*gql.StageQuery) (*Query, error) {
stage.Input = &inputName
}

if outputStage != "" && outputStage == stageId && i < len(gqlStages)-1 {
stage.OutputStage = true
}

query.Stages = append(query.Stages, stage)
query.StageIds = append(query.StageIds, stageId)
}

return query, nil
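Taken together, the changes to `data_source_query.go` enforce that at most one stage carries `output_stage = true` (otherwise `errMoreThanOneOutputStages` is returned as a diagnostic), and when no stage is flagged the last stage keeps its default role as the output. A hypothetical test sketch that relies only on the `Stage` struct and `getOutputStagesCount` shown above:

```go
package observe

import "testing"

// Hypothetical sketch; the pipeline strings are illustrative OPAL only.
func TestGetOutputStagesCount(t *testing.T) {
	stages := []Stage{
		{Pipeline: `filter severity = "error"`},                          // intermediate stage
		{Pipeline: "statsby count(), group_by(host)", OutputStage: true}, // explicitly flagged output
		{Pipeline: "limit 100"},                                          // last stage: output by default when nothing is flagged
	}
	if got := getOutputStagesCount(stages); got != 1 {
		t.Fatalf("expected 1 flagged output stage, got %d", got)
	}
}
```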
8 changes: 8 additions & 0 deletions observe/descriptions/monitor.yaml
@@ -26,3 +26,11 @@ schema:
Whether notification reminders are enabled for this monitor. To enable them, set `reminder_frequency`.
notify_on_close: |
Enables a final update when a monitor notification is closed (no longer triggered).
rule:
log:
expression_summary: |
Short summary or comment describing how the data for the monitor is queried.
log_stage_id: |
ID of the stage used to generate logs for preview. This is usually a stage before aggregation.
source_log_dataset: |
ID of the dataset that contains logs for preview.
3 changes: 3 additions & 0 deletions observe/descriptions/transform.yaml
@@ -20,3 +20,6 @@ schema:
contains a single element.
pipeline: |
An OPAL snippet defining a transformation on the selected input.
output_stage: |
A boolean flag used to specify the output stage. Should be used only for
a stage preceding the last stage. The last stage is an output stage by default.
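
The schema code above reads these strings via `descriptions.Get("transform", "schema", "stage", "output_stage")`. The `observe/descriptions` package itself is not rendered in this diff; the sketch below is only a guess at how such a lookup could work — treating the first key as the YAML file name and the rest as a path through the nested maps — and the `embed`/`yaml.v3` choices are assumptions.

```go
package descriptions

import (
	"embed"
	"strings"

	"gopkg.in/yaml.v3"
)

// Hypothetical sketch; the real package may load and cache its YAML differently.
//
//go:embed *.yaml
var files embed.FS

// Get resolves a description: the first key names the file (e.g. "transform"
// -> transform.yaml), the remaining keys walk the nested maps.
func Get(keys ...string) string {
	if len(keys) < 2 {
		return ""
	}
	raw, err := files.ReadFile(keys[0] + ".yaml")
	if err != nil {
		return ""
	}
	var root map[string]interface{}
	if err := yaml.Unmarshal(raw, &root); err != nil {
		return ""
	}
	var cur interface{} = root
	for _, k := range keys[1:] {
		m, ok := cur.(map[string]interface{})
		if !ok {
			return ""
		}
		cur = m[k]
	}
	s, _ := cur.(string)
	return strings.TrimSpace(s)
}
```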
19 changes: 10 additions & 9 deletions observe/helpers.go
@@ -24,15 +24,16 @@ import (
)

var (
errObjectIDInvalid = errors.New("object id is invalid")
errNameMissing = errors.New("name not set")
errInputsMissing = errors.New("no inputs defined")
errStagesMissing = errors.New("no stages defined")
errInputNameMissing = errors.New("name not set")
errInputEmpty = errors.New("dataset not set")
errNameConflict = errors.New("name already declared")
errStageInputUnresolved = errors.New("input could not be resolved")
errStageInputMissing = errors.New("input missing")
errObjectIDInvalid = errors.New("object id is invalid")
errNameMissing = errors.New("name not set")
errInputsMissing = errors.New("no inputs defined")
errStagesMissing = errors.New("no stages defined")
errInputNameMissing = errors.New("name not set")
errInputEmpty = errors.New("dataset not set")
errNameConflict = errors.New("name already declared")
errStageInputUnresolved = errors.New("input could not be resolved")
errStageInputMissing = errors.New("input missing")
errMoreThanOneOutputStages = errors.New("too many output stages")

stringType = reflect.TypeOf("")

The remaining changed files are not rendered here.
