Loki

  • Canonical Observability
Channel            Revision   Published     Runs on
latest/stable      160        10 Sep 2024   Ubuntu 20.04
latest/candidate   161        19 Nov 2024   Ubuntu 20.04
latest/beta        177        19 Nov 2024   Ubuntu 20.04
latest/edge        177        16 Nov 2024   Ubuntu 20.04
1.0/stable         104        12 Dec 2023   Ubuntu 20.04
1.0/candidate      104        22 Nov 2023   Ubuntu 20.04
1.0/beta           104        22 Nov 2023   Ubuntu 20.04
1.0/edge           104        22 Nov 2023   Ubuntu 20.04
juju deploy loki-k8s --channel 1.0/edge

After relating Loki to other charms, you may encounter situations where log lines appear to be missing.

Checklist

  • The source of the log files is related to Loki.
  • The Loki URL (from the grafana-agent or promtail config files) is reachable from the source container (see the sketch after this list).
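A minimal reachability check, assuming the log source is deployed as pg and Loki's unit IP is the one used in the examples below; the exact JSON path in juju status output may differ between Juju versions, so adjust as needed:

# Find the Loki unit IP (assumes the application is deployed as "loki")
juju status loki --format=json | jq -r '.applications.loki.units["loki/0"].address'

# From the log source, confirm that IP is reachable
# (pg/0 and the IP are illustrative; curl must be available in the target container)
juju ssh pg/0 curl -s 10.1.166.94:3100/ready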

Check status

You can curl the Loki unit IP for status. In the sample output below, the ingester isn’t ready yet.

❯ curl 10.1.166.94:3100/ready
Ingester not ready: waiting for 15s after being ready
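If the ingester is still starting up, you can poll the endpoint until it reports ready. A minimal sketch, assuming the same unit IP as above and that /ready answers with HTTP 200 once all modules are up:

# Poll every 5 seconds until /ready returns HTTP 200
until [ "$(curl -s -o /dev/null -w '%{http_code}' 10.1.166.94:3100/ready)" = "200" ]; do
  sleep 5
done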

❯ curl 10.1.166.94:3100/services
querier => Running
query-frontend-tripperware => Running
ring => Running
query-scheduler => Running
query-frontend => Running
ingester-querier => Running
compactor => Running
ruler => Running
ingester => Running
distributor => Running
server => Running
memberlist-kv => Running
analytics => Running
store => Running
cache-generation-loader => Running
query-scheduler-ring => Running
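With this many services, it can be quicker to show only the ones that are not running; an empty result means everything is up:

❯ curl -s 10.1.166.94:3100/services | grep -v Running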

Confirm whether Loki received anything at all

You can curl the Loki unit IP for labels and alert rules. In the first sample below the response carries no data, which means Loki has not ingested any log streams yet; in the second, logs have arrived and the Juju topology labels are present.

❯ curl 10.1.166.94:3100/loki/api/v1/labels
{"status":"success"}

❯ curl 10.1.166.94:3100/loki/api/v1/labels
{"status":"success","data":["filename","job","juju_application","juju_charm","juju_model","juju_model_uuid","juju_unit"]}

❯ curl 10.1.166.94:3100/loki/api/v1/label/juju_unit/values
{"status":"success","data":["pg/0"]}

❯ curl 10.1.166.94:3100/loki/api/v1/rules
no rule groups found
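You can list the values of any other Juju topology label in the same way, for example to see which applications have shipped logs (the label names are taken from the labels output above):

❯ curl 10.1.166.94:3100/loki/api/v1/label/juju_application/values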

Now that you know which labels exist, you can retrieve some logs:

❯ curl -sG 10.1.166.94:3100/loki/api/v1/query_range --data-urlencode 'query={juju_unit="pg/0"}' | jq '.data.result[0]'
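By default, query_range only covers a recent window (roughly the last hour). To search further back or cap the number of returned lines, you can pass start, end and limit explicitly; a sketch assuming GNU date, since Loki accepts nanosecond Unix epoch timestamps:

❯ curl -sG 10.1.166.94:3100/loki/api/v1/query_range \
    --data-urlencode 'query={juju_unit="pg/0"}' \
    --data-urlencode 'limit=10' \
    --data-urlencode "start=$(date -d '24 hours ago' +%s)000000000" \
    --data-urlencode "end=$(date +%s)000000000" | jq '.data.result[0].values'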

You can query for the average logging rate. In the sample below, it is 0.1 log lines per second (6 log lines per minute).

❯ curl -sG 10.1.166.94:3100/loki/api/v1/query --data-urlencode 'query=rate({job=~".+"}[10m])' | jq '.data.result'
[
  {
    "metric": {
      "filename": "/var/log/postgresql/patroni.log",
      "job": "juju_test-bundle-iwfn_f427ffe2_pg",
      "juju_application": "pg",
      "juju_charm": "postgresql-k8s",
      "juju_model": "test-bundle-iwfn",
      "juju_model_uuid": "f427ffe2-9d96-482c-80c4-f200a20eb1bd",
      "juju_unit": "pg/0"
    },
    "value": [
      1715247333.466,
      "0.1"
    ]
  }
]
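To spot at a glance which application stopped logging, you can aggregate the rate per application instead of per stream; a sketch using the juju_application label shown above:

❯ curl -sG 10.1.166.94:3100/loki/api/v1/query --data-urlencode 'query=sum by (juju_application) (rate({job=~".+"}[10m]))' | jq '.data.result'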

Query for particular log lines

If only a subset of logs is missing, you can confirm their existence in Loki by filtering on labels and/or content. In the sample below, Loki is queried for log lines that contain “leader”.

❯ curl -sG 10.1.166.94:3100/loki/api/v1/query --data-urlencode 'query=({job=~".+"} |= "leader")' | jq '.data.result'
[
  {
    "stream": {
      "filename": "/var/log/postgresql/patroni.log",
      "job": "juju_test-bundle-iwfn_f427ffe2_pg",
      "juju_application": "pg",
      "juju_charm": "postgresql-k8s",
      "juju_model": "test-bundle-iwfn",
      "juju_model_uuid": "f427ffe2-9d96-482c-80c4-f200a20eb1bd",
      "juju_unit": "pg/0"
    },
    "values": [
      [
        "1715258886211320804",
        "2024-05-09 12:48:06 UTC [15]: INFO: no action. I am (pg-0), the leader with the lock "
      ],
      [
        "1715258876184953745",
        "2024-05-09 12:47:56 UTC [15]: INFO: no action. I am (pg-0), the leader with the lock "
      ],
      [
        "1715258866412113833",
        "2024-05-09 12:47:46 UTC [15]: INFO: no action. I am (pg-0), the leader with the lock "
      ]
    ]
  }
]
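Line filters also accept regular expressions, which helps when you are unsure of the exact wording. A sketch that matches “error” or “leader” case-insensitively for a single application (the label value is illustrative):

❯ curl -sG 10.1.166.94:3100/loki/api/v1/query --data-urlencode 'query=({juju_application="pg"} |~ "(?i)(error|leader)")' | jq '.data.result'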

List active loggers

To obtain a list of all sources that logged something recently, query for streams whose recent line count exceeds a threshold:

❯ curl -sG 10.1.166.94:3100/loki/api/v1/query --data-urlencode 'query=count_over_time({filename=~".+"}[1m]) > 2' | jq '.data.result'
[
  {
    "metric": {
      "filename": "/var/log/postgresql/patroni.log",
      "job": "juju_test-bundle-iwfn_f427ffe2_pg",
      "juju_application": "pg",
      "juju_charm": "postgresql-k8s",
      "juju_model": "test-bundle-iwfn",
      "juju_model_uuid": "f427ffe2-9d96-482c-80c4-f200a20eb1bd",
      "juju_unit": "pg/0"
    },
    "value": [
      1715249068.007,
      "6"
    ]
  }
]
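To list the label sets of all streams Loki has seen recently, regardless of whether they crossed such a threshold, you can also query the series endpoint; pass explicit start/end parameters if you need a wider window than the default lookback (the matcher here is illustrative):

❯ curl -sG 10.1.166.94:3100/loki/api/v1/series --data-urlencode 'match[]={job=~".+"}' | jq '.data'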

Logs pushed by grafana-agent or promtail

Confirm that logs are being sent out:

# grafana-agent
juju ssh grafana-agent/0 curl localhost:12345/metrics | grep "promtail_sent_"

# promtail
juju ssh mysql-router/0 curl localhost:9080/metrics | grep -E "promtail_read_|promtail_sent_"

If the values are zero (or have stayed constant for quite some time), make sure the monitored log files exist and are not empty.
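A quick way to check this from outside the workload, assuming a Kubernetes charm whose workload container is named postgresql and reusing the patroni.log path from the samples above (both are illustrative; adjust to your charm):

# Confirm the monitored file exists and has a non-zero size inside the workload container
juju ssh --container postgresql pg/0 ls -l /var/log/postgresql/patroni.log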
