Monasca/Logging

This page documents the Monasca logging solution, which is currently a work in progress.

Monasca Log Agent - Logstash
Monitors one or more log files, adds meta information (e.g. dimensions), authenticates with Keystone, and sends the logs in bulk to the Monasca Log API.

Base technology: Logstash

Plugin: https://github.com/logstash-plugins/logstash-output-monasca_log_api
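A minimal agent configuration might look like the following sketch. The option names and values are illustrative assumptions only and should be checked against the plugin's README.

```
input {
  file {
    path => "/var/log/nova/*.log"
  }
}

output {
  monasca_log_api {
    # Option names below are illustrative; consult the plugin's
    # documentation for the authoritative configuration keys.
    monasca_log_api_url => "http://monasca:5607/v3.0"
    keystone_api_url    => "http://keystone:35357/v3"
    username            => "agent"
    password            => "secret"
    project_name        => "mini-mon"
    dimensions          => ["hostname:node-1"]
  }
}
```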

Monasca Log Agent - Beaver
Monitors one file, adds meta information (e.g. dimensions), authenticates with Keystone, and sends the logs in bulk to the Monasca Log API.

https://github.com/python-beaver/python-beaver/pull/406

https://github.com/python-beaver/python-beaver

Monasca Log API
Consumes logs from the agents, authorizes them and publishes them to Kafka.

https://github.com/openstack/monasca-log-api

https://github.com/openstack/monasca-log-api/tree/master/docs
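To illustrate the bulk interface the agents use, the sketch below builds a request body that groups several log lines under a shared set of dimensions. The exact schema and field names are assumptions here and should be verified against the API documentation linked above.

```python
import json

def build_bulk_payload(messages, dimensions):
    """Group raw log lines into a single bulk request body.

    The layout (a top-level "dimensions" object plus a "logs" list of
    per-message entries) is an assumption for illustration.
    """
    return {
        "dimensions": dimensions,  # shared dimensions, e.g. hostname, service
        "logs": [{"message": m} for m in messages],
    }

payload = build_bulk_payload(
    ["ERROR something failed", "INFO started"],
    {"hostname": "node-1", "service": "nova"},
)
body = json.dumps(payload)  # sent as the request body to the Log API
```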

Monasca Log Transformer
Consumes logs from Kafka, transforms them, and publishes them to Kafka.
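One transformation this component could perform is extracting a severity level from the raw message so downstream components can filter on it. The sketch below is a hypothetical example of such a step, not the Transformer's actual logic.

```python
import re

# Severity keywords commonly found in OpenStack log lines.
SEVERITY_RE = re.compile(r"\b(CRITICAL|ERROR|WARNING|INFO|DEBUG)\b")

def transform(log_entry):
    """Annotate a log entry with a 'severity' dimension, if one is found.

    This is an illustrative transformation; the real component may apply
    different or additional rules.
    """
    match = SEVERITY_RE.search(log_entry.get("message", ""))
    if match:
        log_entry.setdefault("dimensions", {})["severity"] = match.group(1)
    return log_entry
```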

Monasca Log Persister
Consumes logs from Kafka, prepares them for bulk storage, and stores them into Elasticsearch.
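Elasticsearch's bulk API expects an action line before each document, with the whole body terminated by a newline. A sketch of how logs could be prepared for bulk storage (index name and document shape are assumptions):

```python
import json

def to_bulk_body(logs, index="logs"):
    """Serialize log documents into an Elasticsearch bulk request body.

    Each document is preceded by an 'index' action line naming its
    target index; the body must end with a trailing newline.
    """
    lines = []
    for log in logs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(log))
    return "\n".join(lines) + "\n"
```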

Monasca Log Metrics
Consumes logs from Kafka, creates metrics for logs with severity CRITICAL, ERROR, WARNING, and publishes them to Kafka.
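The counting step can be sketched as follows; the metric naming scheme (`log.error` etc.) is a hypothetical convention for illustration, not necessarily what the component emits.

```python
from collections import Counter

# Only these severities are turned into metrics, per the description above.
MONITORED = {"CRITICAL", "ERROR", "WARNING"}

def logs_to_metrics(logs, timestamp):
    """Count logs per monitored severity and emit one metric per level."""
    counts = Counter(
        log["dimensions"]["severity"]
        for log in logs
        if log.get("dimensions", {}).get("severity") in MONITORED
    )
    return [
        {"name": "log.%s" % sev.lower(), "value": n, "timestamp": timestamp}
        for sev, n in sorted(counts.items())
    ]
```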

Monasca Log Storage
All logs are stored in Elasticsearch.

Monasca Kibana Server
Authorizes users with Keystone and visualizes logs stored in Elasticsearch.

Base technology: Kibana

Plugins:

https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/fts-keystone

https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/keystone-v3-client

Log Data Flow
The following diagram visualizes how logs are integrated into the Monasca processing pipeline. It indicates shortcuts we want to take as a first step, as well as advanced functionality (multi-tenancy) that we plan for the future.