Compare commits

...

32 Commits

Author SHA1 Message Date
Michel Hollands
2144cea411 Reformat loki-rules
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-26 14:43:08 +01:00
Michel Hollands
81a017551b This was moved to a separate PR
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-26 14:20:39 +01:00
Michel Hollands
11d80263a7 Update dashboards from Loki
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-26 14:08:59 +01:00
Michel Hollands
cdb0bee56e First draft
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-25 19:15:00 +01:00
Michel Hollands
58171a6a42 Merge pull request #75 from grafana/more_docs_fixes
Small docs fixes
2024-04-25 15:29:02 +01:00
Michel Hollands
c65445384b Merge pull request #72 from grafana/chore/update-tempo-distributed
[dependency] Update the Tempo Distributed subchart
2024-04-25 15:28:37 +01:00
Michel Hollands
1f980f393e Small docs fixes
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-25 15:28:09 +01:00
Michel Hollands
47d9190eda Merge pull request #74 from grafana/add_more_docs
Add docs on how to install dashboards and rules in the cloud
2024-04-25 15:25:26 +01:00
Michel Hollands
5ff9bd16c9 Add docs on how to install dashboards and rules in the cloud
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-25 15:22:18 +01:00
MichelHollands
d6faaf88f5 Update Tempo Distributed 2024-04-25 07:02:32 +00:00
Michel Hollands
2d711f7168 Merge pull request #73 from grafana/update_main_page
Update installation instructions
2024-04-24 10:48:09 +01:00
Michel Hollands
c666bf69c9 Remove whitespace
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-24 10:46:31 +01:00
Michel Hollands
41619b99b1 Update example values.yaml for local mode
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-24 10:45:38 +01:00
Michel Hollands
5923139796 Update installation instructions
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-24 10:40:52 +01:00
Michel Hollands
329d5822ea Merge pull request #71 from grafana/update_installation_instructions
Update installation instructions
2024-04-23 11:12:35 +01:00
Michel Hollands
5498b27ad6 Add repo update step
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-23 11:12:00 +01:00
Michel Hollands
da687315e7 Merge pull request #70 from grafana/chore/update-loki
[dependency] Update the Loki subchart
2024-04-23 11:08:51 +01:00
Michel Hollands
8f20e45c77 Update installation instructions
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-23 09:25:03 +01:00
Michel Hollands
e81b1246f5 Merge pull request #69 from grafana/fix_release_flow
Get Release Helm chart Github action working
2024-04-23 08:51:54 +01:00
MichelHollands
b103fb3434 Update loki 2024-04-23 07:02:51 +00:00
Michel Hollands
9349d2d906 Merge pull request #68 from grafana/chore/update-tempo-distributed
[dependency] Update the Tempo Distributed subchart
2024-04-22 14:02:07 +01:00
Michel Hollands
31536103c8 Add release instructions
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 13:52:11 +01:00
Michel Hollands
13c28aa50a Remove repositories
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 13:41:38 +01:00
Michel Hollands
385d0dd543 Specify repos
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 13:21:08 +01:00
Michel Hollands
458451922d Add owner
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 11:21:45 +01:00
Michel Hollands
4b0d457af0 Remove permissions
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 11:04:49 +01:00
Michel Hollands
e60b2aecdc Add id-token permission
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 10:52:34 +01:00
Michel Hollands
6244de677e Add write permissions for release
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 10:49:09 +01:00
Michel Hollands
d14e933e84 Use correct path
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 10:37:21 +01:00
Michel Hollands
0210fba39d Use get-vault-secrets action
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 10:31:18 +01:00
Michel Hollands
a97fa64880 Use uppercase keys
Signed-off-by: Michel Hollands <michel.hollands@gmail.com>
2024-04-22 10:24:08 +01:00
MichelHollands
0938193982 Update Tempo Distributed 2024-04-22 07:03:00 +00:00
26 changed files with 5844 additions and 265 deletions

View File

@@ -8,9 +8,6 @@ env:
CR_PACKAGE_PATH: "${{ github.workspace }}/.cr-release-packages"
CR_TOOL_PATH: "${{ github.workspace }}/.cr-tool"
CR_VERSION: "1.5.0"
permissions:
contents: read
id-token: write
jobs:
setup:
runs-on: ubuntu-latest
@@ -65,12 +62,23 @@ jobs:
needs: [setup]
runs-on: ubuntu-latest
if: needs.setup.outputs.changed == 'true'
permissions:
contents: write
id-token: write
steps:
- id: get-secrets
uses: grafana/shared-workflows/actions/get-vault-secrets@main
with:
# Secrets placed in the ci/repo/grafana/<repo>/<path> path in Vault
repo_secrets: |
APP_ID=github-app:app-id
PRIVATE_KEY=github-app:private-key
- uses: actions/create-github-app-token@v1
id: app-token
with:
app-id: ${{ secrets.app-id }}
private-key: ${{ secrets.private-key }}
app-id: ${{ env.APP_ID }}
private-key: ${{ env.PRIVATE_KEY }}
owner: ${{ github.repository_owner }}
- name: Checkout
uses: actions/checkout@v4
with:
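Read together, the hunks above move the `contents` and `id-token` permissions from the workflow level onto the release job and replace the repository-secret references (`secrets.app-id`, `secrets.private-key`) with values fetched from Vault by the get-vault-secrets action, which exposes them as environment variables. A condensed sketch of how the job reads after the change, assuming a job key of `release` (the name is cut off by the hunk header) and omitting the later steps:

```
release:
  needs: [setup]
  runs-on: ubuntu-latest
  if: needs.setup.outputs.changed == 'true'
  permissions:
    contents: write
    id-token: write
  steps:
    - id: get-secrets
      uses: grafana/shared-workflows/actions/get-vault-secrets@main
      with:
        # Secrets placed in the ci/repo/grafana/<repo>/<path> path in Vault
        repo_secrets: |
          APP_ID=github-app:app-id
          PRIVATE_KEY=github-app:private-key
    - uses: actions/create-github-app-token@v1
      id: app-token
      with:
        # APP_ID and PRIVATE_KEY are the environment variables exported by the step above
        app-id: ${{ env.APP_ID }}
        private-key: ${{ env.PRIVATE_KEY }}
        owner: ${{ github.repository_owner }}
    - name: Checkout
      uses: actions/checkout@v4
      # remaining steps unchanged
```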

.gitignore (new file, 1 line added)
View File

@@ -0,0 +1 @@
production/

View File

@@ -8,3 +8,15 @@ help:
helm-lint: ## Run helm linter
$(MAKE) -BC charts/meta-monitoring lint
MIXIN_PATH := production/loki-mixin
MIXIN_OUT_PATH_META_MONITORING := production/loki-mixin-compiled-meta-monitoring
mixin: ## Create our version of the mixin
@rm -rf $(MIXIN_PATH)
./scripts/clone_loki_mixin.sh
@rm -rf $(MIXIN_OUT_PATH_META_MONITORING) && mkdir $(MIXIN_OUT_PATH_META_MONITORING)
@cd $(MIXIN_PATH) && jb install
@mixtool generate all --output-alerts $(MIXIN_OUT_PATH_META_MONITORING)/alerts.yaml --output-rules $(MIXIN_OUT_PATH_META_MONITORING)/rules.yaml --directory $(MIXIN_OUT_PATH_META_MONITORING)/dashboards ${MIXIN_PATH}/mixin-meta-monitoring.libsonnet
@cp $(MIXIN_OUT_PATH_META_MONITORING)/dashboards/* charts/meta-monitoring/src/dashboards
@cp $(MIXIN_OUT_PATH_META_MONITORING)/rules.yaml charts/meta-monitoring/src/rules/loki-rules.yaml
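The new `mixin` target expects `jb` (jsonnet-bundler) and `mixtool` on the PATH, clones the Loki mixin via `scripts/clone_loki_mixin.sh`, compiles it, and copies the resulting dashboards and rules into the chart sources. A minimal sketch of regenerating them locally, assuming a Go toolchain is available (the install commands are assumptions, not part of the diff):

```
# install the tools the Makefile target expects on the PATH (assumed install paths)
go install github.com/jsonnet-bundler/jsonnet-bundler/cmd/jb@latest
go install github.com/monitoring-mixins/mixtool/cmd/mixtool@main

# clone the Loki mixin, compile it, and copy dashboards/rules into the chart
make mixin
```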

View File

@@ -1,20 +1,9 @@
# meta-monitoring-chart
This is a meta-monitoring chart for GEL, GEM and GET. It should be installed in a
separate namespace next to GEM, GEL or GET installations.
This is a meta-monitoring chart for Loki.
Note that this is pre-production software at the moment.
## Preparation
Create a values.yaml file based on the [default one](../charts/meta-monitoring/values.yaml).
1. Add or remove the namespaces to monitor in the `namespacesToMonitor` setting
1. Set the cluster name in the `clusterName` setting. This will be added as a label to all logs, metrics and traces.
1. Create a `meta` namespace.
## Local and cloud modes
The chart has 2 modes: local and cloud. In the local mode logs, metrics and/or traces are sent
@@ -34,12 +23,6 @@ Both modes can be enabled at the same time.
## Installation
```
helm install -n meta --skip-crds -f values.yaml meta ./charts/meta-monitoring
```
If the platform supports CRDs the `--skip-crds` option can be removed. However the CRDs are not used by this chart.
For more instructions including how to update the chart go to the [installation](docs/installation.md) page.
## Supported features
@@ -59,7 +42,6 @@ Most of these features are enabled by default. See the values.yaml file for how
## Caveats
- The [loki.source.kubernetes](https://grafana.com/docs/agent/latest/flow/reference/components/loki.source.kubernetes/) component of the Grafana Agent is used to scrape Kubernetes log files. This component is marked experimental at the moment.
- This has not been tested on Openshift yet.
- The underlying Loki, Mimir and Tempo are at the default size installed by the Helm chart. This might need changing when monitoring bigger Loki, Mimir or Tempo installations.
- MinIO is used as storage at the moment with a limited retention. At the moment this chart cannot be used for monitoring over longer periods.
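The Preparation steps in the README diff above reference a values.yaml with `namespacesToMonitor` and `clusterName` settings; a minimal sketch of such a file, assuming defaults for everything else (the namespace and cluster name shown are placeholders):

```
# values.yaml sketch; only the two settings named in the README are shown
namespacesToMonitor:
  - loki                 # namespaces whose logs, metrics and traces are collected
clusterName: my-cluster  # added as a label to all logs, metrics and traces
```

The install command shown in the diff (`helm install -n meta --skip-crds -f values.yaml meta ./charts/meta-monitoring`) then picks this file up via `-f values.yaml`.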

View File

@@ -1,7 +1,7 @@
dependencies:
- name: loki
repository: https://grafana.github.io/helm-charts
version: 6.2.0
version: 6.3.4
- name: alloy
repository: https://grafana.github.io/helm-charts
version: 0.1.1
@@ -10,9 +10,9 @@ dependencies:
version: 5.3.0
- name: tempo-distributed
repository: https://grafana.github.io/helm-charts
version: 1.9.2
version: 1.9.4
- name: minio
repository: https://charts.min.io
version: 5.1.0
digest: sha256:f9a79bfc30df65ba2ff94a097844f6b3aa41970318d5fb1708b3aeecebbe68d1
generated: "2024-04-16T08:10:03.998180905Z"
digest: sha256:4bb2a4f62c9ebddcd64c28a94126ab3f07d319b028ea7c17ffbdf28d86b3be61
generated: "2024-04-25T07:02:28.663945601Z"

View File

@@ -22,7 +22,7 @@ appVersion: "0.0.1"
dependencies:
- name: loki
repository: https://grafana.github.io/helm-charts
version: 6.2.0
version: 6.3.4
condition: local.logs.enabled
- name: alloy
repository: https://grafana.github.io/helm-charts
@@ -33,7 +33,7 @@ dependencies:
condition: local.metrics.enabled
- name: tempo-distributed
repository: https://grafana.github.io/helm-charts
version: 1.9.2
version: 1.9.4
condition: local.traces.enabled
- name: minio
repository: https://charts.min.io

Binary file not shown.

View File

@@ -64,7 +64,7 @@
"span": 6,
"targets": [
{
"expr": "sum(loki_ingester_memory_chunks{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"})",
"expr": "sum(loki_ingester_memory_chunks{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"})",
"format": "time_series",
"legendFormat": "series",
"legendLink": null
@@ -111,7 +111,7 @@
"span": 6,
"targets": [
{
"expr": "sum(loki_ingester_memory_chunks{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}) / sum(loki_ingester_memory_streams{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"})",
"expr": "sum(loki_ingester_memory_chunks{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}) / sum(loki_ingester_memory_streams{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"})",
"format": "time_series",
"legendFormat": "chunks",
"legendLink": null
@@ -171,19 +171,19 @@
"span": 6,
"targets": [
{
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_utilization_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) by (le)) * 1",
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_utilization_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le)) * 1",
"format": "time_series",
"legendFormat": "99th Percentile",
"refId": "A"
},
{
"expr": "histogram_quantile(0.50, sum(rate(loki_ingester_chunk_utilization_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) by (le)) * 1",
"expr": "histogram_quantile(0.50, sum(rate(loki_ingester_chunk_utilization_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le)) * 1",
"format": "time_series",
"legendFormat": "50th Percentile",
"refId": "B"
},
{
"expr": "sum(rate(loki_ingester_chunk_utilization_sum{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) * 1 / sum(rate(loki_ingester_chunk_utilization_count{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval]))",
"expr": "sum(rate(loki_ingester_chunk_utilization_sum{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) * 1 / sum(rate(loki_ingester_chunk_utilization_count{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "Average",
"refId": "C"
@@ -249,19 +249,19 @@
"span": 6,
"targets": [
{
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_age_seconds_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) by (le)) * 1e3",
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_age_seconds_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le)) * 1e3",
"format": "time_series",
"legendFormat": "99th Percentile",
"refId": "A"
},
{
"expr": "histogram_quantile(0.50, sum(rate(loki_ingester_chunk_age_seconds_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) by (le)) * 1e3",
"expr": "histogram_quantile(0.50, sum(rate(loki_ingester_chunk_age_seconds_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le)) * 1e3",
"format": "time_series",
"legendFormat": "50th Percentile",
"refId": "B"
},
{
"expr": "sum(rate(loki_ingester_chunk_age_seconds_sum{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) * 1e3 / sum(rate(loki_ingester_chunk_age_seconds_count{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval]))",
"expr": "sum(rate(loki_ingester_chunk_age_seconds_sum{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) * 1e3 / sum(rate(loki_ingester_chunk_age_seconds_count{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "Average",
"refId": "C"
@@ -339,19 +339,19 @@
"span": 6,
"targets": [
{
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_entries_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) by (le)) * 1",
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_entries_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le)) * 1",
"format": "time_series",
"legendFormat": "99th Percentile",
"refId": "A"
},
{
"expr": "histogram_quantile(0.50, sum(rate(loki_ingester_chunk_entries_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) by (le)) * 1",
"expr": "histogram_quantile(0.50, sum(rate(loki_ingester_chunk_entries_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le)) * 1",
"format": "time_series",
"legendFormat": "50th Percentile",
"refId": "B"
},
{
"expr": "sum(rate(loki_ingester_chunk_entries_sum{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) * 1 / sum(rate(loki_ingester_chunk_entries_count{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval]))",
"expr": "sum(rate(loki_ingester_chunk_entries_sum{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) * 1 / sum(rate(loki_ingester_chunk_entries_count{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "Average",
"refId": "C"
@@ -416,7 +416,7 @@
"span": 6,
"targets": [
{
"expr": "sum(rate(loki_chunk_store_index_entries_per_chunk_sum{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[5m])) / sum(rate(loki_chunk_store_index_entries_per_chunk_count{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[5m]))",
"expr": "sum(rate(loki_chunk_store_index_entries_per_chunk_sum{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) / sum(rate(loki_chunk_store_index_entries_per_chunk_count{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "Index Entries",
"legendLink": null
@@ -475,7 +475,7 @@
"span": 6,
"targets": [
{
"expr": "loki_ingester_flush_queue_length{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"} or cortex_ingester_flush_queue_length{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}",
"expr": "loki_ingester_flush_queue_length{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"} or cortex_ingester_flush_queue_length{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
@@ -673,7 +673,7 @@
"stack": true,
"targets": [
{
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_ingester_chunk_age_seconds_count{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_ingester_chunk_age_seconds_count{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"format": "time_series",
"legendFormat": "{{status}}",
"refId": "A"
@@ -732,7 +732,7 @@
"span": 6,
"targets": [
{
"expr": "sum(rate(loki_ingester_chunks_flushed_total{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval]))",
"expr": "sum(rate(loki_ingester_chunks_flushed_total{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
@@ -780,7 +780,7 @@
"stack": true,
"targets": [
{
"expr": "sum by (reason) (rate(loki_ingester_chunks_flushed_total{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) / ignoring(reason) group_left sum(rate(loki_ingester_chunks_flushed_total{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval]))",
"expr": "sum by (reason) (rate(loki_ingester_chunks_flushed_total{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) / ignoring(reason) group_left sum(rate(loki_ingester_chunks_flushed_total{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "{{reason}}",
"legendLink": null
@@ -843,7 +843,7 @@
"span": 12,
"targets": [
{
"expr": "sum by (le) (rate(loki_ingester_chunk_utilization_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval]))",
"expr": "sum by (le) (rate(loki_ingester_chunk_utilization_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]))",
"format": "heatmap",
"intervalFactor": 2,
"legendFormat": "{{le}}",
@@ -905,7 +905,7 @@
"span": 12,
"targets": [
{
"expr": "sum(rate(loki_ingester_chunk_size_bytes_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[$__rate_interval])) by (le)",
"expr": "sum(rate(loki_ingester_chunk_size_bytes_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le)",
"format": "heatmap",
"intervalFactor": 2,
"legendFormat": "{{le}}",
@@ -981,19 +981,19 @@
"span": 12,
"targets": [
{
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_size_bytes_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[1m])) by (le))",
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_size_bytes_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le))",
"format": "time_series",
"legendFormat": "p99",
"legendLink": null
},
{
"expr": "histogram_quantile(0.90, sum(rate(loki_ingester_chunk_size_bytes_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[1m])) by (le))",
"expr": "histogram_quantile(0.90, sum(rate(loki_ingester_chunk_size_bytes_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le))",
"format": "time_series",
"legendFormat": "p90",
"legendLink": null
},
{
"expr": "histogram_quantile(0.50, sum(rate(loki_ingester_chunk_size_bytes_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[1m])) by (le))",
"expr": "histogram_quantile(0.50, sum(rate(loki_ingester_chunk_size_bytes_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le))",
"format": "time_series",
"legendFormat": "p50",
"legendLink": null
@@ -1052,19 +1052,19 @@
"span": 12,
"targets": [
{
"expr": "histogram_quantile(0.5, sum(rate(loki_ingester_chunk_bounds_hours_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[5m])) by (le))",
"expr": "histogram_quantile(0.5, sum(rate(loki_ingester_chunk_bounds_hours_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le))",
"format": "time_series",
"legendFormat": "p50",
"legendLink": null
},
{
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_bounds_hours_bucket{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[5m])) by (le))",
"expr": "histogram_quantile(0.99, sum(rate(loki_ingester_chunk_bounds_hours_bucket{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) by (le))",
"format": "time_series",
"legendFormat": "p99",
"legendLink": null
},
{
"expr": "sum(rate(loki_ingester_chunk_bounds_hours_sum{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[5m])) / sum(rate(loki_ingester_chunk_bounds_hours_count{cluster=\"$cluster\", job=~\"$namespace/(loki|enterprise-logs)-write\"}[5m]))",
"expr": "sum(rate(loki_ingester_chunk_bounds_hours_sum{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval])) / sum(rate(loki_ingester_chunk_bounds_hours_count{cluster=\"$cluster\", job=~\"$namespace/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "avg",
"legendLink": null

View File

@@ -579,7 +579,7 @@
"span": 6,
"targets": [
{
"expr": "sum(rate(loki_compactor_deleted_lines{cluster=~\"$cluster\",job=~\"$namespace/(loki|enterprise-logs)-read\"}[$__rate_interval])) by (user)",
"expr": "sum(rate(loki_compactor_deleted_lines{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor|(loki|enterprise-logs)-backend.*|loki-single-binary)\"}[$__rate_interval])) by (user)",
"format": "time_series",
"legendFormat": "{{user}}",
"legendLink": null
@@ -606,7 +606,7 @@
"span": 6,
"targets": [
{
"expr": "{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"compactor\"} |~ \"Started processing delete request|delete request for user marked as processed\" | logfmt | line_format \"{{.ts}} user={{.user}} delete_request_id={{.delete_request_id}} msg={{.msg}}\" ",
"expr": "{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor|(loki|enterprise-logs)-backend.*|loki-single-binary)\"} |~ \"Started processing delete request|delete request for user marked as processed\" | logfmt | line_format \"{{.ts}} user={{.user}} delete_request_id={{.delete_request_id}} msg={{.msg}}\" ",
"refId": "A"
}
],
@@ -619,7 +619,7 @@
"span": 6,
"targets": [
{
"expr": "{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"compactor\"} |~ \"delete request for user added\" | logfmt | line_format \"{{.ts}} user={{.user}} query='{{.query}}'\"",
"expr": "{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor|(loki|enterprise-logs)-backend.*|loki-single-binary)\"} |~ \"delete request for user added\" | logfmt | line_format \"{{.ts}} user={{.user}} query='{{.query}}'\"",
"refId": "A"
}
],

View File

@@ -114,6 +114,11 @@
"dashLength": 10,
"dashes": false,
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"unit": "s"
}
},
"fill": 1,
"fillGradient": 0,
"gridPos": {
@@ -236,7 +241,7 @@
"steppedLine": false,
"targets": [
{
"expr": "sum(rate(container_cpu_usage_seconds_total{cluster=\"$cluster\", namespace=\"$namespace\", pod=~\"$deployment.*\", pod=~\"$pod\", container=~\"$container\"}[5m]))",
"expr": "sum(rate(container_cpu_usage_seconds_total{cluster=\"$cluster\", namespace=\"$namespace\", pod=~\"$deployment.*\", pod=~\"$pod\", container=~\"$container\"}[$__rate_interval]))",
"refId": "A"
}
],
@@ -287,6 +292,11 @@
"dashLength": 10,
"dashes": false,
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"unit": "bytes"
}
},
"fill": 1,
"fillGradient": 0,
"gridPos": {
@@ -373,6 +383,11 @@
"dashLength": 10,
"dashes": false,
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"unit": "binBps"
}
},
"fill": 1,
"fillGradient": 0,
"gridPos": {
@@ -408,7 +423,7 @@
"steppedLine": false,
"targets": [
{
"expr": "sum(rate(container_network_transmit_bytes_total{cluster=\"$cluster\", namespace=\"$namespace\", pod=~\"$deployment.*\", pod=~\"$pod\"}[5m]))",
"expr": "sum(rate(container_network_transmit_bytes_total{cluster=\"$cluster\", namespace=\"$namespace\", pod=~\"$deployment.*\", pod=~\"$pod\"}[$__rate_interval]))",
"refId": "A"
}
],
@@ -459,6 +474,11 @@
"dashLength": 10,
"dashes": false,
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"unit": "binBps"
}
},
"fill": 1,
"fillGradient": 0,
"gridPos": {
@@ -494,7 +514,7 @@
"steppedLine": false,
"targets": [
{
"expr": "sum(rate(container_network_receive_bytes_total{cluster=\"$cluster\", namespace=\"$namespace\", pod=~\"$deployment.*\", pod=~\"$pod\"}[5m]))",
"expr": "sum(rate(container_network_receive_bytes_total{cluster=\"$cluster\", namespace=\"$namespace\", pod=~\"$deployment.*\", pod=~\"$pod\"}[$__rate_interval]))",
"refId": "A"
}
],
@@ -632,6 +652,11 @@
"dashLength": 10,
"dashes": false,
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"unit": "ops"
}
},
"fill": 1,
"fillGradient": 0,
"gridPos": {
@@ -667,7 +692,7 @@
"steppedLine": false,
"targets": [
{
"expr": "sum(rate(promtail_custom_bad_words_total{cluster=\"$cluster\", exported_namespace=\"$namespace\", exported_pod=~\"$deployment.*\", exported_pod=~\"$pod\", container=~\"$container\"}[5m])) by (level)",
"expr": "sum(rate(promtail_custom_bad_words_total{cluster=\"$cluster\", exported_namespace=\"$namespace\", exported_pod=~\"$deployment.*\", exported_pod=~\"$pod\", container=~\"$container\"}[$__rate_interval])) by (level)",
"legendFormat": "{{level}}",
"refId": "A"
}
@@ -719,6 +744,11 @@
"dashLength": 10,
"dashes": false,
"datasource": "$loki_datasource",
"fieldConfig": {
"defaults": {
"unit": "ops"
}
},
"fill": 1,
"fillGradient": 0,
"gridPos": {
@@ -771,7 +801,7 @@
"steppedLine": false,
"targets": [
{
"expr": "sum(rate({cluster=\"$cluster\", namespace=\"$namespace\", pod=~\"$deployment.*\", pod=~\"$pod\", container=~\"$container\" } |logfmt| level=\"$level\" |= \"$filter\" [5m])) by (level)",
"expr": "sum(rate({cluster=\"$cluster\", namespace=\"$namespace\", pod=~\"$deployment.*\", pod=~\"$pod\", container=~\"$container\" } |logfmt| level=\"$level\" |= \"$filter\" | __error__=\"\" [$__auto])) by (level)",
"intervalFactor": 3,
"legendFormat": "{{level}}",
"refId": "A"

View File

@@ -300,7 +300,8 @@
"value": 80
}
]
}
},
"unit": "s"
},
"overrides": [ ]
},

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -103,19 +103,19 @@
"span": 4,
"targets": [
{
"expr": "sum by(pod) (rate(container_cpu_usage_seconds_total{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-read.*\"}[$__rate_interval]))",
"expr": "sum by(pod) (rate(container_cpu_usage_seconds_total{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor.*|(loki|enterprise-logs)-backend.*|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
},
{
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-read.*\", resource=\"cpu\"} > 0)",
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor.*|(loki|enterprise-logs)-backend.*|loki-single-binary)\", resource=\"cpu\"} > 0)",
"format": "time_series",
"legendFormat": "request",
"legendLink": null
},
{
"expr": "min(container_spec_cpu_quota{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-read.*\"} / container_spec_cpu_period{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-read.*\"})",
"expr": "min(container_spec_cpu_quota{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor.*|(loki|enterprise-logs)-backend.*|loki-single-binary)\"} / container_spec_cpu_period{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor.*|(loki|enterprise-logs)-backend.*|loki-single-binary)\"})",
"format": "time_series",
"legendFormat": "limit",
"legendLink": null
@@ -204,19 +204,19 @@
"span": 4,
"targets": [
{
"expr": "max by(pod) (container_memory_working_set_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-read.*\"})",
"expr": "max by(pod) (container_memory_working_set_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor.*|(loki|enterprise-logs)-backend.*|loki-single-binary)\"})",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
},
{
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-read.*\", resource=\"memory\"} > 0)",
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor.*|(loki|enterprise-logs)-backend.*|loki-single-binary)\", resource=\"memory\"} > 0)",
"format": "time_series",
"legendFormat": "request",
"legendLink": null
},
{
"expr": "min(container_spec_memory_limit_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-read.*\"} > 0)",
"expr": "min(container_spec_memory_limit_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", pod=~\"(compactor.*|(loki|enterprise-logs)-backend.*|loki-single-binary)\"} > 0)",
"format": "time_series",
"legendFormat": "limit",
"legendLink": null
@@ -266,7 +266,7 @@
"span": 4,
"targets": [
{
"expr": "sum by(pod) (go_memstats_heap_inuse_bytes{cluster=~\"$cluster\", job=~\"($namespace)/(loki|enterprise-logs)-read\"})",
"expr": "sum by(pod) (go_memstats_heap_inuse_bytes{cluster=~\"$cluster\", job=~\"($namespace)/\"(compactor|(loki|enterprise-logs)-backend.*|loki-single-binary)\"\"})",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
@@ -1367,7 +1367,7 @@
"span": 12,
"targets": [
{
"expr": "{cluster=~\"$cluster\", job=~\"($namespace)/(loki|enterprise-logs)-read\"}",
"expr": "{cluster=~\"$cluster\", job=~\"($namespace)/\"(compactor|(loki|enterprise-logs)-backend.*|loki-single-binary)\"\"}",
"refId": "A"
}
],

View File

@@ -22,6 +22,270 @@
],
"refresh": "10s",
"rows": [
{
"collapse": false,
"height": "250px",
"panels": [
{
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 10,
"lineWidth": 1,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
}
},
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "short"
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "request"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#FFC000",
"mode": "fixed"
}
},
{
"id": "custom.fillOpacity",
"value": 0
}
]
},
{
"matcher": {
"id": "byName",
"options": "limit"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#E02F44",
"mode": "fixed"
}
},
{
"id": "custom.fillOpacity",
"value": 0
}
]
}
]
},
"id": 1,
"links": [ ],
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 4,
"targets": [
{
"expr": "sum by(pod) (rate(container_cpu_usage_seconds_total{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"distributor\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
},
{
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"distributor\", resource=\"cpu\"} > 0)",
"format": "time_series",
"legendFormat": "request",
"legendLink": null
},
{
"expr": "min(container_spec_cpu_quota{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"distributor\"} / container_spec_cpu_period{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"distributor\"})",
"format": "time_series",
"legendFormat": "limit",
"legendLink": null
}
],
"title": "CPU",
"tooltip": {
"sort": 2
},
"type": "timeseries"
},
{
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 10,
"lineWidth": 1,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
}
},
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "bytes"
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "request"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#FFC000",
"mode": "fixed"
}
},
{
"id": "custom.fillOpacity",
"value": 0
}
]
},
{
"matcher": {
"id": "byName",
"options": "limit"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#E02F44",
"mode": "fixed"
}
},
{
"id": "custom.fillOpacity",
"value": 0
}
]
}
]
},
"id": 2,
"links": [ ],
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 4,
"targets": [
{
"expr": "max by(pod) (container_memory_working_set_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"distributor\"})",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
},
{
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"distributor\", resource=\"memory\"} > 0)",
"format": "time_series",
"legendFormat": "request",
"legendLink": null
},
{
"expr": "min(container_spec_memory_limit_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"distributor\"} > 0)",
"format": "time_series",
"legendFormat": "limit",
"legendLink": null
}
],
"title": "Memory (workingset)",
"tooltip": {
"sort": 2
},
"type": "timeseries"
},
{
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 10,
"lineWidth": 1,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
}
},
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "bytes"
},
"overrides": [ ]
},
"id": 3,
"links": [ ],
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 4,
"targets": [
{
"expr": "sum by(pod) (go_memstats_heap_inuse_bytes{cluster=~\"$cluster\", job=~\"($namespace)/distributor\"})",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
}
],
"title": "Memory (go heap inuse)",
"tooltip": {
"sort": 2
},
"type": "timeseries"
}
],
"repeat": null,
"repeatIteration": null,
"repeatRowId": null,
"showTitle": true,
"title": "Distributor",
"titleSize": "h6"
},
{
"collapse": false,
"collapsed": false,
@@ -51,7 +315,7 @@
"overrides": [ ]
},
"gridPos": { },
"id": 1,
"id": 4,
"links": [ ],
"options": {
"legend": {
@@ -64,7 +328,7 @@
},
"targets": [
{
"expr": "sum by(pod) (loki_ingester_memory_streams{cluster=~\"$cluster\", job=~\"($namespace)/(loki|enterprise-logs)-write\"})",
"expr": "sum by(pod) (loki_ingester_memory_streams{cluster=~\"$cluster\", job=~\"($namespace)/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"})",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
@@ -140,7 +404,7 @@
]
},
"gridPos": { },
"id": 2,
"id": 5,
"links": [ ],
"options": {
"legend": {
@@ -153,19 +417,19 @@
},
"targets": [
{
"expr": "sum by(pod) (rate(container_cpu_usage_seconds_total{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\"}[$__rate_interval]))",
"expr": "sum by(pod) (rate(container_cpu_usage_seconds_total{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
},
{
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\", resource=\"cpu\"} > 0)",
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\", resource=\"cpu\"} > 0)",
"format": "time_series",
"legendFormat": "request",
"legendLink": null
},
{
"expr": "min(container_spec_cpu_quota{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\"} / container_spec_cpu_period{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\"})",
"expr": "min(container_spec_cpu_quota{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\"} / container_spec_cpu_period{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\"})",
"format": "time_series",
"legendFormat": "limit",
"legendLink": null
@@ -241,7 +505,7 @@
]
},
"gridPos": { },
"id": 3,
"id": 6,
"links": [ ],
"options": {
"legend": {
@@ -254,19 +518,19 @@
},
"targets": [
{
"expr": "max by(pod) (container_memory_working_set_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\"})",
"expr": "max by(pod) (container_memory_working_set_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\"})",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
},
{
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\", resource=\"memory\"} > 0)",
"expr": "min(kube_pod_container_resource_requests{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\", resource=\"memory\"} > 0)",
"format": "time_series",
"legendFormat": "request",
"legendLink": null
},
{
"expr": "min(container_spec_memory_limit_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\"} > 0)",
"expr": "min(container_spec_memory_limit_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\"} > 0)",
"format": "time_series",
"legendFormat": "limit",
"legendLink": null
@@ -303,7 +567,7 @@
"overrides": [ ]
},
"gridPos": { },
"id": 4,
"id": 7,
"links": [ ],
"options": {
"legend": {
@@ -316,7 +580,7 @@
},
"targets": [
{
"expr": "sum by(pod) (go_memstats_heap_inuse_bytes{cluster=~\"$cluster\", job=~\"($namespace)/(loki|enterprise-logs)-write\"})",
"expr": "sum by(pod) (go_memstats_heap_inuse_bytes{cluster=~\"$cluster\", job=~\"($namespace)/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\"})",
"format": "time_series",
"legendFormat": "{{pod}}",
"legendLink": null
@@ -353,7 +617,7 @@
"overrides": [ ]
},
"gridPos": { },
"id": 5,
"id": 8,
"links": [ ],
"options": {
"legend": {
@@ -366,7 +630,7 @@
},
"targets": [
{
"expr": "sum by(instance, pod, device) (rate(node_disk_written_bytes_total[$__rate_interval])) + ignoring(pod) group_right() (label_replace(count by(instance, pod, device) (container_fs_writes_bytes_total{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\", device!~\".*sda.*\"}), \"device\", \"$1\", \"device\", \"/dev/(.*)\") * 0)\n",
"expr": "sum by(instance, pod, device) (rate(node_disk_written_bytes_total[$__rate_interval])) + ignoring(pod) group_right() (label_replace(count by(instance, pod, device) (container_fs_writes_bytes_total{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\", device!~\".*sda.*\"}), \"device\", \"$1\", \"device\", \"/dev/(.*)\") * 0)\n",
"format": "time_series",
"legendFormat": "{{pod}} - {{device}}",
"legendLink": null
@@ -400,7 +664,7 @@
"overrides": [ ]
},
"gridPos": { },
"id": 6,
"id": 9,
"links": [ ],
"options": {
"legend": {
@@ -413,7 +677,7 @@
},
"targets": [
{
"expr": "sum by(instance, pod, device) (rate(node_disk_read_bytes_total[$__rate_interval])) + ignoring(pod) group_right() (label_replace(count by(instance, pod, device) (container_fs_writes_bytes_total{cluster=~\"$cluster\", namespace=~\"$namespace\", container=\"loki\", pod=~\"(loki|enterprise-logs)-write.*\", device!~\".*sda.*\"}), \"device\", \"$1\", \"device\", \"/dev/(.*)\") * 0)\n",
"expr": "sum by(instance, pod, device) (rate(node_disk_read_bytes_total[$__rate_interval])) + ignoring(pod) group_right() (label_replace(count by(instance, pod, device) (container_fs_writes_bytes_total{cluster=~\"$cluster\", namespace=~\"$namespace\", container=~\"loki|ingester\", pod=~\"(ingester.*|(loki|enterprise-logs)-write.*|loki-single-binary)\", device!~\".*sda.*\"}), \"device\", \"$1\", \"device\", \"/dev/(.*)\") * 0)\n",
"format": "time_series",
"legendFormat": "{{pod}} - {{device}}",
"legendLink": null
@@ -447,7 +711,7 @@
"overrides": [ ]
},
"gridPos": { },
"id": 7,
"id": 10,
"links": [ ],
"options": {
"legend": {
@@ -460,7 +724,7 @@
},
"targets": [
{
"expr": "max by(persistentvolumeclaim) (kubelet_volume_stats_used_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\"} / kubelet_volume_stats_capacity_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\"}) and count by(persistentvolumeclaim) (kube_persistentvolumeclaim_labels{cluster=~\"$cluster\", namespace=~\"$namespace\",label_name=~\"(loki|enterprise-logs)-write.*\"})",
"expr": "max by(persistentvolumeclaim) (kubelet_volume_stats_used_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\"} / kubelet_volume_stats_capacity_bytes{cluster=~\"$cluster\", namespace=~\"$namespace\"}) and count by(persistentvolumeclaim) (kube_persistentvolumeclaim_labels{cluster=~\"$cluster\", namespace=~\"$namespace\",label_name=~\"(ingester.*|(loki|enterprise-logs)-write|loki-single-binary).*\"})",
"format": "time_series",
"legendFormat": "{{persistentvolumeclaim}}",
"legendLink": null
@@ -474,7 +738,7 @@
"repeatIteration": null,
"repeatRowId": null,
"showTitle": true,
"title": "Write path",
"title": "Ingester",
"titleSize": "h6",
"type": "row"
}

View File

@@ -215,7 +215,7 @@
"stack": true,
"targets": [
{
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"format": "time_series",
"legendFormat": "{{status}}",
"refId": "A"
@@ -263,7 +263,7 @@
"span": 6,
"targets": [
{
"expr": "histogram_quantile(0.99, sum by (le) (cluster_job_route:loki_request_duration_seconds_bucket:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(loki|enterprise-logs)-write\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"})) * 1e3",
"expr": "histogram_quantile(0.99, sum by (le) (cluster_job_route:loki_request_duration_seconds_bucket:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"})) * 1e3",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "99th Percentile",
@@ -271,7 +271,7 @@
"step": 10
},
{
"expr": "histogram_quantile(0.50, sum by (le) (cluster_job_route:loki_request_duration_seconds_bucket:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(loki|enterprise-logs)-write\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"})) * 1e3",
"expr": "histogram_quantile(0.50, sum by (le) (cluster_job_route:loki_request_duration_seconds_bucket:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"})) * 1e3",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "50th Percentile",
@@ -279,7 +279,7 @@
"step": 10
},
{
"expr": "1e3 * sum(cluster_job_route:loki_request_duration_seconds_sum:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(loki|enterprise-logs)-write\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"}) / sum(cluster_job_route:loki_request_duration_seconds_count:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(loki|enterprise-logs)-write\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"})",
"expr": "1e3 * sum(cluster_job_route:loki_request_duration_seconds_sum:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"}) / sum(cluster_job_route:loki_request_duration_seconds_count:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\", route=~\"api_prom_push|loki_api_v1_push|/httpgrpc.HTTP/Handle\"})",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "Average",
@@ -313,7 +313,7 @@
"repeatIteration": null,
"repeatRowId": null,
"showTitle": true,
"title": "Write Path",
"title": "Distributor",
"titleSize": "h6"
},
{
@@ -358,7 +358,7 @@
"span": 6,
"targets": [
{
"expr": "sum (rate(loki_distributor_structured_metadata_bytes_received_total{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\",}[$__rate_interval])) / sum(rate(loki_distributor_bytes_received_total{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\",}[$__rate_interval]))",
"expr": "sum (rate(loki_distributor_structured_metadata_bytes_received_total{cluster=~\"$cluster\",job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\",}[$__rate_interval])) / sum(rate(loki_distributor_bytes_received_total{cluster=~\"$cluster\",job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\",}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "bytes",
"legendLink": null
@@ -406,7 +406,7 @@
"stack": true,
"targets": [
{
"expr": "sum by (tenant) (rate(loki_distributor_structured_metadata_bytes_received_total{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\",}[$__rate_interval])) / ignoring(tenant) group_left sum(rate(loki_distributor_structured_metadata_bytes_received_total{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\",}[$__rate_interval]))",
"expr": "sum by (tenant) (rate(loki_distributor_structured_metadata_bytes_received_total{cluster=~\"$cluster\",job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\",}[$__rate_interval])) / ignoring(tenant) group_left sum(rate(loki_distributor_structured_metadata_bytes_received_total{cluster=~\"$cluster\",job=~\"($namespace)/(distributor|(loki|enterprise-logs)-write|loki-single-binary)\",}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "{{tenant}}",
"legendLink": null
@@ -438,7 +438,7 @@
"repeatIteration": null,
"repeatRowId": null,
"showTitle": true,
"title": "Write Path",
"title": "Distributor - Structured Metadata",
"titleSize": "h6"
},
{
@@ -634,7 +634,7 @@
"stack": true,
"targets": [
{
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_boltdb_shipper_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\", operation=\"WRITE\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(ingester-zone.*|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"format": "time_series",
"legendFormat": "{{status}}",
"refId": "A"
@@ -682,19 +682,895 @@
"span": 6,
"targets": [
{
"expr": "histogram_quantile(0.99, sum(rate(loki_boltdb_shipper_request_duration_seconds_bucket{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\", operation=\"WRITE\"}[$__rate_interval])) by (le)) * 1e3",
"expr": "histogram_quantile(0.99, sum by (le) (cluster_job_route:loki_request_duration_seconds_bucket:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(ingester-zone.*|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"})) * 1e3",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "99th Percentile",
"refId": "A",
"step": 10
},
{
"expr": "histogram_quantile(0.50, sum by (le) (cluster_job_route:loki_request_duration_seconds_bucket:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(ingester-zone.*|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"})) * 1e3",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "50th Percentile",
"refId": "B",
"step": 10
},
{
"expr": "1e3 * sum(cluster_job_route:loki_request_duration_seconds_sum:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(ingester-zone.*|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"}) / sum(cluster_job_route:loki_request_duration_seconds_count:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(ingester-zone.*|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"})",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "Average",
"refId": "C",
"step": 10
}
],
"title": "Latency",
"type": "timeseries",
"yaxes": [
{
"format": "ms",
"label": null,
"logBase": 1,
"max": null,
"min": 0,
"show": true
},
{
"format": "short",
"label": null,
"logBase": 1,
"max": null,
"min": null,
"show": false
}
]
}
],
"repeat": null,
"repeatIteration": null,
"repeatRowId": null,
"showTitle": true,
"title": "Ingester - Zone Aware",
"titleSize": "h6"
},
{
"collapse": false,
"height": "250px",
"panels": [
{
"aliasColors": {
"1xx": "#EAB839",
"2xx": "#7EB26D",
"3xx": "#6ED0E0",
"4xx": "#EF843C",
"5xx": "#E24D42",
"OK": "#7EB26D",
"cancel": "#A9A9A9",
"error": "#E24D42",
"success": "#7EB26D"
},
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 100,
"lineWidth": 0,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "normal"
}
},
"min": 0,
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "short"
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "1xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#EAB839",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "2xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "3xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#6ED0E0",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "4xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#EF843C",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "5xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#E24D42",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "OK"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "cancel"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#A9A9A9",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "error"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#E24D42",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "success"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
}
]
},
"fill": 10,
"id": 7,
"linewidth": 0,
"links": [ ],
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 6,
"stack": true,
"targets": [
{
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"format": "time_series",
"legendFormat": "{{status}}",
"refId": "A"
}
],
"title": "QPS",
"type": "timeseries"
},
{
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 10,
"lineWidth": 1,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
}
},
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "ms"
},
"overrides": [ ]
},
"id": 8,
"links": [ ],
"nullPointMode": "null as zero",
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 6,
"targets": [
{
"expr": "histogram_quantile(0.99, sum by (le) (cluster_job_route:loki_request_duration_seconds_bucket:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"})) * 1e3",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "99th Percentile",
"refId": "A",
"step": 10
},
{
"expr": "histogram_quantile(0.50, sum by (le) (cluster_job_route:loki_request_duration_seconds_bucket:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"})) * 1e3",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "50th Percentile",
"refId": "B",
"step": 10
},
{
"expr": "1e3 * sum(cluster_job_route:loki_request_duration_seconds_sum:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"}) / sum(cluster_job_route:loki_request_duration_seconds_count:sum_rate{cluster=~\"$cluster\", job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", route=\"/logproto.Pusher/Push\"})",
"format": "time_series",
"intervalFactor": 2,
"legendFormat": "Average",
"refId": "C",
"step": 10
}
],
"title": "Latency",
"type": "timeseries",
"yaxes": [
{
"format": "ms",
"label": null,
"logBase": 1,
"max": null,
"min": 0,
"show": true
},
{
"format": "short",
"label": null,
"logBase": 1,
"max": null,
"min": null,
"show": false
}
]
}
],
"repeat": null,
"repeatIteration": null,
"repeatRowId": null,
"showTitle": true,
"title": "Ingester",
"titleSize": "h6"
},
{
"collapse": false,
"height": "250px",
"panels": [
{
"aliasColors": {
"1xx": "#EAB839",
"2xx": "#7EB26D",
"3xx": "#6ED0E0",
"4xx": "#EF843C",
"5xx": "#E24D42",
"OK": "#7EB26D",
"cancel": "#A9A9A9",
"error": "#E24D42",
"success": "#7EB26D"
},
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 100,
"lineWidth": 0,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "normal"
}
},
"min": 0,
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "short"
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "1xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#EAB839",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "2xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "3xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#6ED0E0",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "4xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#EF843C",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "5xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#E24D42",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "OK"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "cancel"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#A9A9A9",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "error"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#E24D42",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "success"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
}
]
},
"fill": 10,
"id": 9,
"linewidth": 0,
"links": [ ],
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 6,
"stack": true,
"targets": [
{
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_index_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"index_chunk\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"format": "time_series",
"legendFormat": "{{status}}",
"refId": "A"
}
],
"title": "QPS",
"type": "timeseries"
},
{
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 10,
"lineWidth": 1,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
}
},
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "ms"
},
"overrides": [ ]
},
"id": 10,
"links": [ ],
"nullPointMode": "null as zero",
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 6,
"targets": [
{
"expr": "histogram_quantile(0.99, sum(rate(loki_index_request_duration_seconds_bucket{cluster=~\"$cluster\",job=~\"($namespace)/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"index_chunk\"}[$__rate_interval])) by (le)) * 1e3",
"format": "time_series",
"legendFormat": "99th Percentile",
"refId": "A"
},
{
"expr": "histogram_quantile(0.50, sum(rate(loki_boltdb_shipper_request_duration_seconds_bucket{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\", operation=\"WRITE\"}[$__rate_interval])) by (le)) * 1e3",
"expr": "histogram_quantile(0.50, sum(rate(loki_index_request_duration_seconds_bucket{cluster=~\"$cluster\",job=~\"($namespace)/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"index_chunk\"}[$__rate_interval])) by (le)) * 1e3",
"format": "time_series",
"legendFormat": "50th Percentile",
"refId": "B"
},
{
"expr": "sum(rate(loki_boltdb_shipper_request_duration_seconds_sum{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\", operation=\"WRITE\"}[$__rate_interval])) * 1e3 / sum(rate(loki_boltdb_shipper_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(loki|enterprise-logs)-write\", operation=\"WRITE\"}[$__rate_interval]))",
"expr": "sum(rate(loki_index_request_duration_seconds_sum{cluster=~\"$cluster\",job=~\"($namespace)/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"index_chunk\"}[$__rate_interval])) * 1e3 / sum(rate(loki_index_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(ingester.*|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"index_chunk\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "Average",
"refId": "C"
}
],
"title": "Latency",
"type": "timeseries",
"yaxes": [
{
"format": "ms",
"label": null,
"logBase": 1,
"max": null,
"min": 0,
"show": true
},
{
"format": "short",
"label": null,
"logBase": 1,
"max": null,
"min": null,
"show": false
}
]
}
],
"repeat": null,
"repeatIteration": null,
"repeatRowId": null,
"showTitle": true,
"title": "Index",
"titleSize": "h6"
},
{
"collapse": false,
"height": "250px",
"panels": [
{
"aliasColors": {
"1xx": "#EAB839",
"2xx": "#7EB26D",
"3xx": "#6ED0E0",
"4xx": "#EF843C",
"5xx": "#E24D42",
"OK": "#7EB26D",
"cancel": "#A9A9A9",
"error": "#E24D42",
"success": "#7EB26D"
},
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 100,
"lineWidth": 0,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "normal"
}
},
"min": 0,
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "short"
},
"overrides": [
{
"matcher": {
"id": "byName",
"options": "1xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#EAB839",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "2xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "3xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#6ED0E0",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "4xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#EF843C",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "5xx"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#E24D42",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "OK"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "cancel"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#A9A9A9",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "error"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#E24D42",
"mode": "fixed"
}
}
]
},
{
"matcher": {
"id": "byName",
"options": "success"
},
"properties": [
{
"id": "color",
"value": {
"fixedColor": "#7EB26D",
"mode": "fixed"
}
}
]
}
]
},
"fill": 10,
"id": 11,
"linewidth": 0,
"links": [ ],
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 6,
"stack": true,
"targets": [
{
"expr": "sum by (status) (\n label_replace(label_replace(rate(loki_boltdb_shipper_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"WRITE\"}[$__rate_interval]),\n \"status\", \"${1}xx\", \"status_code\", \"([0-9])..\"),\n \"status\", \"${1}\", \"status_code\", \"([a-zA-Z]+)\"))\n",
"format": "time_series",
"legendFormat": "{{status}}",
"refId": "A"
}
],
"title": "QPS",
"type": "timeseries"
},
{
"datasource": "$datasource",
"fieldConfig": {
"defaults": {
"custom": {
"drawStyle": "line",
"fillOpacity": 10,
"lineWidth": 1,
"pointSize": 5,
"showPoints": "never",
"spanNulls": false,
"stacking": {
"group": "A",
"mode": "none"
}
},
"thresholds": {
"mode": "absolute",
"steps": [ ]
},
"unit": "ms"
},
"overrides": [ ]
},
"id": 12,
"links": [ ],
"nullPointMode": "null as zero",
"options": {
"legend": {
"showLegend": true
},
"tooltip": {
"mode": "single",
"sort": "none"
}
},
"span": 6,
"targets": [
{
"expr": "histogram_quantile(0.99, sum(rate(loki_boltdb_shipper_request_duration_seconds_bucket{cluster=~\"$cluster\",job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"WRITE\"}[$__rate_interval])) by (le)) * 1e3",
"format": "time_series",
"legendFormat": "99th Percentile",
"refId": "A"
},
{
"expr": "histogram_quantile(0.50, sum(rate(loki_boltdb_shipper_request_duration_seconds_bucket{cluster=~\"$cluster\",job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"WRITE\"}[$__rate_interval])) by (le)) * 1e3",
"format": "time_series",
"legendFormat": "50th Percentile",
"refId": "B"
},
{
"expr": "sum(rate(loki_boltdb_shipper_request_duration_seconds_sum{cluster=~\"$cluster\",job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"WRITE\"}[$__rate_interval])) * 1e3 / sum(rate(loki_boltdb_shipper_request_duration_seconds_count{cluster=~\"$cluster\",job=~\"($namespace)/(ingester|(loki|enterprise-logs)-write|loki-single-binary)\", operation=\"WRITE\"}[$__rate_interval]))",
"format": "time_series",
"legendFormat": "Average",
"refId": "C"

View File

@@ -1,4 +1,3 @@
groups:
- name: "loki_rules"
rules:
- expr: "histogram_quantile(0.99, sum(rate(loki_request_duration_seconds_bucket[5m]))
@@ -50,4 +49,4 @@ groups:
record: "cluster_namespace_job_route:loki_request_duration_seconds_sum:sum_rate"
- expr: "sum(rate(loki_request_duration_seconds_count[5m])) by (cluster, namespace,
job, route)"
record: "cluster_namespace_job_route:loki_request_duration_seconds_count:sum_rate"
record: "cluster_namespace_job_route:loki_request_duration_seconds_count:sum_rate"

View File

@@ -1,12 +1,11 @@
# Specify the namespaces to monitor here
namespacesToMonitor:
- loki
- mimir
- tempo
# The name of the cluster where this will be installed
clusterLabelValue: "meta-monitoring"
# Set to true to write logs, metrics or traces to Grafana Cloud
# The secrets have to be created first
cloud:
logs:
enabled: true

View File

@@ -0,0 +1,10 @@
# Create a new release
1. Update the version field in charts/meta-monitoring/Chart.yaml in a new PR (see the sketch after this list) and merge the PR once it is approved.
2. On the [Actions tab](https://github.com/grafana/meta-monitoring-chart/actions):
- Select `Release Helm chart` in the workflows on the left
- Click the `Run workflow` button
- Leave the `main` branch as is
- Click the green `Run workflow` button
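A minimal sketch of the version bump from step 1, assuming the chart follows the standard Helm `Chart.yaml` layout (the version numbers are illustrative):
```
# charts/meta-monitoring/Chart.yaml (illustrative excerpt)
apiVersion: v2
name: meta-monitoring
version: 0.0.10   # bump this field for the new release
```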

View File

@@ -1,5 +1,19 @@
# Install this chart
## Preparation for Cloud mode (preferred)
1. Use an existing Grafana Cloud account or set up a new one. Then create an access token:
1. In Grafana go to Administration -> Users and Access -> Cloud access policies.
1. Click `Create access policy`.
1. Fill in the `Display name` field and check the `Write` check box for metrics, logs and traces. Then click `Create`.
1. On the newly created access policy click `Add token`.
1. Fill in the `Token name` field and click `Create`. Make a copy of the token as it will be used later on.
1. Create the meta namespace
```
@@ -11,36 +25,142 @@
```
kubectl create secret generic logs -n meta \
--from-literal=username=<logs username> \
--from-literal=password=<logs password>
--from-literal=password=<token>
--from-literal=endpoint='https://logs-prod-us-central1.grafana.net/loki/api/v1/push'
kubectl create secret generic metrics -n meta \
--from-literal=username=<metrics username> \
--from-literal=password=<metrics password>
--from-literal=password=<token>
--from-literal=endpoint='https://prometheus-us-central1.grafana.net/api/prom/push'
kubectl create secret generic traces -n meta \
--from-literal=username=<traces username> \
--from-literal=password=<traces password>
--from-literal=password=<token>
--from-literal=endpoint='https://tempo-us-central1.grafana.net/tempo'
```
1. Create a values.yaml file based on the [default one](../charts/meta-monitoring/values.yaml). Fill in the names of the secrets created above as needed.
The logs, metrics and traces usernames are the `User / Username / Instance IDs` of the Loki, Prometheus/Mimir and Tempo instances in Grafana Cloud. From `Home` in Grafana click on `Stacks`. Then go to the `Details` pages of Loki, Prometheus/Mimir and Tempo.
1. Create a values.yaml file based on the [default one](../charts/meta-monitoring/values.yaml). Fill in the names of the secrets created above as needed. An example minimal values.yaml looks like this:
```
namespacesToMonitor:
- loki
cloud:
logs:
enabled: true
secret: "logs"
metrics:
enabled: true
secret: "metrics"
traces:
enabled: true
secret: "traces"
```
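Before moving on, you can optionally check that the three secrets exist in the `meta` namespace (standard kubectl commands; the names match the secrets created above):
```
kubectl get secrets -n meta
# Show the keys of a single secret without printing the values
kubectl describe secret logs -n meta
```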
## Preparation for Local mode
1. Create the meta namespace
```
kubectl create namespace meta
```
1. Create a values.yaml file based on the [default one](../charts/meta-monitoring/values.yaml). An example minimal values.yaml looks like this:
```
namespacesToMonitor:
- loki
cloud:
logs:
enabled: false
metrics:
enabled: false
traces:
enabled: false
local:
grafana:
enabled: true
logs:
enabled: true
metrics:
enabled: true
traces:
enabled: true
minio:
enabled: true
```
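After the chart is installed (see the next section), the bundled Grafana can be reached with a port-forward. The service name and port below are assumptions based on a release named `meta` and the Grafana subchart defaults; check the actual name first:
```
# List the services to confirm the Grafana service name (assumed here: meta-grafana)
kubectl get svc -n meta
# Forward local port 3000 to the Grafana service (assumes the default service port 80)
kubectl port-forward -n meta svc/meta-grafana 3000:80
```
Grafana is then available at http://localhost:3000.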
## Installing the chart
1. Add the repo
```
helm repo add grafana https://grafana.github.io/helm-charts
```
1. Fetch the latest charts from the grafana repo
```
helm repo update grafana
```
1. Install this helm chart
```
helm install -n meta -f values.yaml meta ./charts/meta-monitoring
helm install -n meta -f values.yaml meta grafana/meta-monitoring
```
1. Upgrade
```
helm upgrade --install -f values.yaml -n meta meta ./charts/meta-monitoring
helm upgrade --install -f values.yaml -n meta meta grafana/meta-monitoring
```
1. Delete this chart:
```
helm delete -n meta meta
```
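After `helm install` or `helm upgrade`, the release can be verified with standard commands (the pod list will vary with the values used):
```
helm list -n meta
kubectl get pods -n meta
```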
## Installing the dashboards and rules on Grafana Cloud
## Installing the dashboards on Grafana Cloud
Only the dashboard files for the application being monitored have to be imported. For example, when monitoring Loki, import the dashboard files starting with 'loki-'.
For each of the dashboard files in the charts/meta-monitoring/src/dashboards folder, do the following:
1. Click on 'Dashboards' in Grafana
1. Click on the 'New' button and select 'Import'
1. Drop the dashboard file onto the 'Upload dashboard JSON file' drop area
1. Click 'Import'
## Installing the rules on Grafana Cloud
1. Select the rules files in charts/meta-monitoring/src/rules for the application you want to monitor. When monitoring Loki, use loki-rules.yaml.
1. Install mimirtool as per the [instructions](https://grafana.com/docs/mimir/latest/manage/tools/mimirtool/)
1. Create an access policy with Read and Write permissions for Rules. Also create a token and make a note of it.
1. Get your cloud Prometheus endpoint and Instance ID from the `Prometheus` page in `Stacks`.
1. Use them to load the rules using mimirtool as follows:
```
mimirtool rules load --address=<your_cloud_prometheus_endpoint> --id=<your_instance_id> --key=<your_cloud_access_policy_token> *.yaml
```
1. To check the rules you have uploaded, run:
```
mimirtool rules print --address=<your_cloud_prometheus_endpoint> --id=<your_instance_id> --key=<your_cloud_access_policy_token>
```
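As a concrete example, when only Loki is monitored the load step might look like this. The address and instance ID are placeholders; use the Prometheus endpoint and Instance ID from your own stack:
```
cd charts/meta-monitoring/src/rules
mimirtool rules load \
  --address=https://prometheus-us-central1.grafana.net \
  --id=123456 \
  --key=<your_cloud_access_policy_token> \
  loki-rules.yaml
```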

scripts/clone_loki_mixin.sh (Executable file, 20 lines)
View File

@@ -0,0 +1,20 @@
#!/usr/bin/env bash

# Remove the temporary clone directory when the script exits.
clean_up() {
  test -d "$tmp_dir" && rm -fr "$tmp_dir"
}

here=${PWD}
tmp_dir=$(mktemp -d -t my-script.XXXXXX)   # template with X's keeps GNU mktemp happy
trap clean_up EXIT
cd "$tmp_dir"

echo "Cloning Loki"
# Blobless, sparse clone: only production/loki-mixin is checked out.
git clone --filter=blob:none --no-checkout "https://github.com/grafana/loki"
cd loki
git sparse-checkout init --cone
git checkout main
git sparse-checkout set production/loki-mixin

echo "Copying production/loki-mixin to ${here}"
cp -r production "${here}"

View File

@@ -0,0 +1,18 @@
(import 'dashboards.libsonnet') +
(import 'alerts.libsonnet') +
(import 'recording_rules.libsonnet') + {
grafanaDashboardFolder: 'Loki Meta Monitoring',
_config+:: {
internal_components: false,
// The Meta Monitoring helm chart uses Grafana Alloy instead of promtail
promtail+: {
enabled: false,
},
meta_monitoring+: {
enabled: true,
},
},
}