This page documents the Monasca logging solution, which is currently in progress.
- 1 Log Management - Client Side
- 2 Log Management - Server Side - Consuming Logs
- 3 Log Management - Server Side - Visualizing Logs
- 4 Log Data Flow
Log Management - Client Side
Monasca Log Agent - Logstash
Monitors one or more log files, adds meta information (e.g. dimensions), authenticates with Keystone, and sends the logs in bulk to the Monasca Log API.
Base technology: Logstash
Monasca Log Agent - Beaver
Monitors one file, adds meta information (e.g. dimensions), authenticates with Keystone, and sends the logs in bulk to the Monasca Log API.
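Both agents ultimately assemble the same kind of bulk request. The following Python sketch illustrates how an agent might wrap collected log lines with dimensions before a single authenticated POST to the Log API; the exact payload envelope, field names, and header are assumptions here, not the definitive wire format.

```python
import json

def build_bulk_payload(log_lines, dimensions):
    """Wrap raw log lines into one bulk payload.

    Each entry carries the log message plus the shared meta
    information (dimensions) added by the agent.
    """
    return {
        "logs": [
            {"message": line, "dimensions": dimensions}
            for line in log_lines
        ]
    }

# Example: two lines read from a monitored file.
payload = build_bulk_payload(
    ["ERROR: disk full", "INFO: retrying"],
    {"hostname": "node-1", "service": "nova"},
)
body = json.dumps(payload)  # sent in one HTTP POST; the Keystone
                            # token goes into an auth header
```

Sending many lines per request keeps the per-message overhead (HTTP round trip, token validation) low.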
Log Management - Server Side - Consuming Logs
Monasca Log API
Consumes logs from the agents, authorizes them, and publishes them to Kafka.
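A minimal sketch of the authorize-then-publish step, assuming a role check against the Keystone token and a raw-log Kafka topic; the role names, topic name, and message envelope are illustrative assumptions.

```python
import json

RAW_TOPIC = "log"  # assumed name of the raw-log Kafka topic

def handle_bulk_request(token_roles, tenant_id, payload):
    """Authorize a bulk log request and prepare Kafka messages.

    Returns a list of (topic, message) tuples ready to hand
    to a Kafka producer.
    """
    # Assumed authorization rule: the token must carry a role
    # that is allowed to send logs.
    if not {"admin", "monitoring-delegate"} & set(token_roles):
        raise PermissionError("token lacks a role allowed to send logs")
    return [
        (RAW_TOPIC, json.dumps({"tenant_id": tenant_id, "log": entry}))
        for entry in payload["logs"]
    ]
```

Tagging each message with the tenant ID at this point is what lets the downstream consumers stay tenant-aware without re-validating tokens.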
Monasca Log Transformer
Consumes logs from Kafka, transforms them, and publishes them to Kafka.
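As one example of a transformation, the sketch below extracts a severity level from the message text and republishes the enriched record; the real transformer's rules, field names, and topic names may differ.

```python
import json

SEVERITIES = ("CRITICAL", "ERROR", "WARNING", "INFO", "DEBUG")

def transform(raw_message):
    """Parse one raw log message from Kafka and enrich it.

    Here the transformation is a simple severity extraction
    from the message text.
    """
    record = json.loads(raw_message)
    text = record["log"]["message"]
    for level in SEVERITIES:
        if level in text:
            record["log"]["severity"] = level
            break
    return json.dumps(record)  # republished to a transformed-log topic
```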
Monasca Log Persister
Consumes logs from Kafka, prepares them for bulk storage, and stores them into Elasticsearch.
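"Preparing for bulk storage" means turning individual records into one Elasticsearch bulk-API body: alternating action and document lines in newline-delimited JSON. A sketch, assuming a per-tenant index naming scheme (the prefix and scheme are assumptions):

```python
import json

def to_es_bulk(records, index_prefix="logs"):
    """Convert transformed log records into an Elasticsearch
    bulk-API body (action line, then document line, per record)."""
    lines = []
    for rec in records:
        index = f"{index_prefix}-{rec['tenant_id']}"  # assumed naming scheme
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(rec["log"]))
    return "\n".join(lines) + "\n"  # bulk bodies end with a newline
```

Batching writes this way is what makes Elasticsearch ingestion efficient; the persister accumulates records from Kafka and flushes them in groups.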
Monasca Log Metrics
Consumes logs from Kafka, creates metrics for logs with severity CRITICAL, ERROR, or WARNING, and publishes them to Kafka.
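The log-to-metric conversion can be sketched as a filter that emits one counter metric per qualifying log record. The metric names (e.g. `log.error`) and metric fields below are assumptions for illustration:

```python
import time

METRIC_SEVERITIES = {"CRITICAL", "ERROR", "WARNING"}

def logs_to_metrics(records):
    """Emit one counter metric per log whose severity warrants it."""
    metrics = []
    for rec in records:
        severity = rec["log"].get("severity")
        if severity in METRIC_SEVERITIES:
            metrics.append({
                "name": "log." + severity.lower(),  # assumed naming
                "value": 1,
                "timestamp": int(time.time() * 1000),
                "dimensions": rec["log"].get("dimensions", {}),
            })
    return metrics  # published to the metrics Kafka topic
```

Once published as metrics, these counts flow through the regular Monasca metrics pipeline, so existing alarming can react to log volume by severity.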
Monasca Log Storage
All logs are stored in Elasticsearch.
Log Management - Server Side - Visualizing Logs
Monasca Kibana Server
Authorizes with Keystone and visualizes the logs stored in Elasticsearch.
Base technology: Kibana
Log Data Flow
The following diagram visualizes how logs are integrated into the Monasca processing pipeline. It indicates some shortcuts we want to take as a first step, as well as some advanced functionality (multi-tenancy) that we plan for the future.