Monasca/Logging

This page documents the Monasca logging solution, which is currently a work in progress.

Log Management - Client Side

Monasca Log Agent - Logstash

Monitors one or more log files, adds meta information (e.g. dimensions), authenticates with Keystone, and sends the logs in bulk to the Monasca Log API. Base technology: Logstash

Plugin: https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/logstash-output-monasca_log_api
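The following Python sketch approximates what the agent does: obtain a Keystone token and POST a bulk payload to the Monasca Log API. It is only illustrative (host names, port, credentials and dimensions are placeholders, and the payload follows the v3.0 bulk format); the actual plugin is a Logstash output written in Ruby.

```python
from keystoneauth1 import session
from keystoneauth1.identity import v3
import requests

# Authenticate against Keystone (endpoint and credentials are placeholders).
auth = v3.Password(auth_url='http://keystone:5000/v3',
                   username='monasca-agent', password='secret',
                   project_name='mini-mon',
                   user_domain_id='default', project_domain_id='default')
sess = session.Session(auth=auth)
token = sess.get_token()

# Bulk payload: shared dimensions plus one entry per log line read from the
# monitored files (paths and dimensions are illustrative).
payload = {
    'dimensions': {'hostname': 'node-1', 'service': 'nova'},
    'logs': [
        {'message': '2016-06-08 ERROR nova.compute ...',
         'dimensions': {'path': '/var/log/nova/nova-compute.log'}},
        {'message': '2016-06-08 WARNING nova.api ...',
         'dimensions': {'path': '/var/log/nova/nova-api.log'}},
    ],
}

# Send the batch to the Monasca Log API (host and port are placeholders).
resp = requests.post('http://monasca-log-api:5607/v3.0/logs',
                     json=payload,
                     headers={'X-Auth-Token': token})
resp.raise_for_status()
```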

Monasca Log Agent - Beaver

https://github.com/python-beaver/python-beaver

Log Management - Server Side - Consuming Logs

Monasca Log API

Consumes logs from the agents, authorizes them and publishes to Kafka.

https://github.com/openstack/monasca-log-api

https://github.com/openstack/monasca-log-api/tree/master/docs
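A minimal sketch of the publishing step, assuming the kafka-python client and a topic named 'log'; the envelope layout is simplified, and the real service adds more metadata after authorization:

```python
import json
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='kafka:9092',
                         value_serializer=lambda v: json.dumps(v).encode('utf-8'))

def publish_log(tenant_id, log_entry):
    """Wrap an authorized log entry in an envelope and publish it to Kafka.

    The envelope layout and the 'log' topic name are assumptions for
    illustration only.
    """
    envelope = {
        'log': log_entry,                # message + dimensions as received
        'meta': {'tenantId': tenant_id},
    }
    producer.send('log', envelope)

publish_log('abc123', {'message': 'ERROR something failed',
                       'dimensions': {'hostname': 'node-1'}})
producer.flush()
```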

Monasca Log Transformer

Consumes logs from Kafka, transforms them, and publishes to Kafka.
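A sketch of one such transformation, assuming the 'log' and 'transformed-log' topic names and the envelope from the Log API sketch above; here a severity field is derived from the raw message:

```python
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer('log', bootstrap_servers='kafka:9092',
                         group_id='log-transformer',
                         value_deserializer=lambda v: json.loads(v.decode('utf-8')))
producer = KafkaProducer(bootstrap_servers='kafka:9092',
                         value_serializer=lambda v: json.dumps(v).encode('utf-8'))

SEVERITIES = ('CRITICAL', 'ERROR', 'WARNING', 'INFO', 'DEBUG')

for record in consumer:
    envelope = record.value
    message = envelope['log']['message']
    # Example transformation: tag the log entry with the first severity
    # keyword found in the message text.
    severity = next((s for s in SEVERITIES if s in message), 'UNKNOWN')
    envelope['log']['severity'] = severity
    producer.send('transformed-log', envelope)
```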

Monasca Log Persister

Consumes logs from Kafka, prepares them for bulk storage, and stores them in Elasticsearch.
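A sketch of the persister loop, assuming kafka-python and the official Elasticsearch client; the per-tenant daily index naming is an illustrative convention, not necessarily the persister's default:

```python
import datetime
import json
from elasticsearch import Elasticsearch, helpers
from kafka import KafkaConsumer

consumer = KafkaConsumer('transformed-log', bootstrap_servers='kafka:9092',
                         group_id='log-persister',
                         value_deserializer=lambda v: json.loads(v.decode('utf-8')))
es = Elasticsearch(['http://elasticsearch:9200'])

BATCH_SIZE = 500
batch = []

for record in consumer:
    envelope = record.value
    # One bulk action per log entry, routed to a per-tenant daily index.
    index_name = 'logs-%s-%s' % (envelope['meta']['tenantId'],
                                 datetime.date.today().isoformat())
    batch.append({'_index': index_name, '_source': envelope['log']})
    if len(batch) >= BATCH_SIZE:
        helpers.bulk(es, batch)   # write the whole batch in one bulk request
        batch = []
```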

Monasca Log Metrics

Consumes logs from Kafka, creates metrics for logs with severity CRITICAL, ERROR, or WARNING, and publishes them to Kafka.
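A sketch of how such metrics could be derived, reusing the envelope from the transformer sketch above; the topic names, metric names, and metric envelope layout are assumptions for illustration:

```python
import json
import time
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer('transformed-log', bootstrap_servers='kafka:9092',
                         group_id='log-metrics',
                         value_deserializer=lambda v: json.loads(v.decode('utf-8')))
producer = KafkaProducer(bootstrap_servers='kafka:9092',
                         value_serializer=lambda v: json.dumps(v).encode('utf-8'))

for record in consumer:
    envelope = record.value
    severity = envelope['log'].get('severity')
    if severity not in ('CRITICAL', 'ERROR', 'WARNING'):
        continue
    # Emit one counter-style metric per matching log entry.
    metric = {
        'metric': {
            'name': 'log.%s' % severity.lower(),
            'dimensions': envelope['log'].get('dimensions', {}),
            'timestamp': int(time.time() * 1000),
            'value': 1.0,
        },
        'meta': {'tenantId': envelope['meta']['tenantId'], 'region': 'region-1'},
    }
    producer.send('metrics', metric)
```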

Monasca Log Storage

All logs are stored in Elasticsearch.
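Stored documents can then be queried directly with the Elasticsearch client; the index pattern and field names below follow the persister sketch above and are assumptions:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(['http://elasticsearch:9200'])

# Search the (assumed) per-tenant daily indices for ERROR entries that
# mention 'nova' in the message text.
result = es.search(index='logs-abc123-*', body={
    'query': {
        'bool': {
            'must': [
                {'term': {'severity': 'ERROR'}},
                {'match': {'message': 'nova'}},
            ]
        }
    },
    'size': 10,
})

for hit in result['hits']['hits']:
    print(hit['_source']['message'])
```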

Log Management - Server Side - Visualizing Logs

Monasca Kibana Server

Authorization with Keystone and visualization of logs stored in Elasticsearch. Base technology: Kibana. Plugins:

https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/fts-keystone

https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/keystone-v3-client


Log Data Flow

TODO: must be updated!

The following diagram visualizes how logs are integrated into the Monasca processing pipeline. It indicates some shortcuts we want to take as a first step, as well as some advanced functionality (multi-tenancy) that we plan for the future.

[Diagram: Monasca log data flow]