Monasca/Logging
This page documents the Monasca Logging solution that is in progress.
Note:
Currently this is the only page for discussing our proposals. For that reason, some metric-related material can be found here too. We will move or remove it once the discussion is finished.
Log Data Flow
The following diagram visualizes how logs are integrated into the Monasca processing pipeline. We have indicated some shortcuts we want to take as a first step, as well as some advanced functionality (multi-tenancy) that we plan for the future.
Log Client
Monasca Agent
Collect and forward Log Data
The agent should be extended to collect log data and forward it to the Monasca Log API. Implement or integrate a Logstash-like collector (e.g. Beaver), a lightweight Python log file shipper that sends logs to an intermediate broker for further processing by Logstash.
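A minimal Beaver-style shipper could look like the sketch below. The endpoint URL is hypothetical, and the header layout follows the Request Headers description of the Log API further down this page; this is a sketch of the idea, not the agent's actual implementation:

```python
import json
import time
import urllib.request

# Hypothetical Log API endpoint; adjust host/port for your deployment.
LOG_API = "http://monasca-log-api:8080/v2.0/log/single"

def build_headers(token, app_type, dimensions):
    """Assemble the request headers described in the Log API spec."""
    return {
        "Content-Type": "application/json",
        "X-Auth-Token": token,
        "X-Application-Type": app_type,
        "X-Dimensions": ",".join("%s:%s" % kv for kv in sorted(dimensions.items())),
    }

def ship_line(line, token, app_type, dimensions):
    """POST a single log line to the Log API (expects HTTP 204 on success)."""
    body = json.dumps({"message": line}).encode("utf-8")
    request = urllib.request.Request(
        LOG_API, data=body, headers=build_headers(token, app_type, dimensions))
    return urllib.request.urlopen(request)

def follow(path, token, app_type, dimensions):
    """Tail a log file and forward each new line, Beaver-style."""
    with open(path) as log_file:
        log_file.seek(0, 2)  # start at the current end of the file
        while True:
            line = log_file.readline()
            if line:
                ship_line(line.rstrip("\n"), token, app_type, dimensions)
            else:
                time.sleep(0.5)  # wait for new data
```

A production shipper would additionally need batching, retry on failure, and file-rotation handling, which Beaver already provides.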
Plugins
Logstash Output Plugin
https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/logstash-output-monasca_api
Other Output Plugins
TODO e.g. for fluentd...
Log Management Backend
Integrate the ELK stack into the existing Monasca architecture to receive, authenticate, process, and store logs.
- https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/kibana
- https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/ansible-monasca-log-schema
- https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/ansible-monasca-elkstack
Monasca Log API
https://github.com/openstack/monasca-log-api
Name
Note:
Deviations between the Monasca Metric API documentation (https://github.com/stackforge/monasca-api/blob/master/docs/monasca-api-spec.md) and its implementation (https://github.com/stackforge/monasca-api):
- Documentation: Nothing is mentioned about the allowed characters in the name.
- Implementation: Characters in the name are restricted to a-z A-Z 0-9 _ . -
The implementation of the Log API follows the implementation of the Metric API!
Dimensions
Note:
Deviations between the Monasca Metric API documentation (https://github.com/stackforge/monasca-api/blob/master/docs/monasca-api-spec.md) and its implementation (https://github.com/stackforge/monasca-api):
- Documentation: The first character in the dimension is restricted to the following: a-z A-Z 0-9 _ / \ $. However, the next characters may be any character except for the following: ; } { = , & ) ( ".
- Implementation: Characters in the dimension key are restricted to a-z A-Z 0-9 _ . -
The implementation of the Log API follows the implementation of the Metric API!
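The character restriction shared by names and dimension keys can be expressed as a simple validator. This is a sketch mirroring the restriction stated above (a-z A-Z 0-9 _ . -), not the actual validation code of either API:

```python
import re

# Characters allowed by the Metric API implementation, which the
# Log API follows: a-z A-Z 0-9 _ . -
VALID_NAME = re.compile(r"^[a-zA-Z0-9_.\-]+$")

def is_valid_name(name):
    """Return True when the name or dimension key uses only permitted characters."""
    return bool(VALID_NAME.match(name))
```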
Request Line
- POST /v2.0/log/single - Endpoint for single and multiline logs; maximum log size: 1MB
- POST /v2.0/log/bulk - Endpoint for bulk logs (the logs must be line-separated); maximum log size: 5MB (TODO)
Request Headers
- Content-Type (string, required) - application/json
- X-Auth-Token (string, required) - Keystone authentication token
- X-Application-Type (string, optional) - Type of application (a hint for how to parse the log)
- X-Dimensions (string, optional) - A dictionary of (key, value) pairs used to structure the logs and to support later filtering (e.g. in a dashboard)
Timestamps:
The priority for determining the timestamp:
1) Try to parse timestamp from original log data
2) If that doesn't work:
- RESTful API: Use receiving time as timestamp
- future version: Syslog API: Take timestamp from syslog
Monasca operates in the UTC timezone; the timestamp is therefore converted to the corresponding time in UTC.
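The priority above can be sketched as follows. The ISO-8601 pattern and the function signature are assumptions for illustration; the actual API may recognize additional timestamp formats:

```python
import re
from datetime import datetime, timezone

# Simplified ISO-8601 pattern (an assumption; real logs use many formats).
ISO_RE = re.compile(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:[+-]\d{2}:\d{2}|Z)?")

def determine_timestamp(log_line, received_at):
    """Apply the timestamp priority: parse from the log data first,
    fall back to the receiving time, and normalize to UTC."""
    # 1) Try to parse a timestamp from the original log data.
    match = ISO_RE.search(log_line)
    if match:
        text = match.group(0).replace("Z", "+00:00")
        try:
            parsed = datetime.fromisoformat(text)
            if parsed.tzinfo is None:
                parsed = parsed.replace(tzinfo=timezone.utc)
            return parsed.astimezone(timezone.utc)
        except ValueError:
            pass
    # 2) RESTful API fallback: use the receiving time.
    return received_at.astimezone(timezone.utc)
```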
Request Body
The original (potentially unstructured) log data.
Request Examples
Single Log - JSON
POST /v2.0/log/single HTTP/1.1
Content-Type: application/json
X-Auth-Token: 27feed73a0ce4138934e30d619b415b0
X-Application-Type: apache
X-Dimensions: applicationname:WebServer01,environment:production

{"message":"Hello World!", "from":"hoover"}
Bulk of Logs - Plain Text (TODO)
POST /v2.0/log/bulk HTTP/1.1
Content-Type: text/plain
X-Auth-Token: 27feed73a0ce4138934e30d619b415b0
X-Application-Type: apache
X-Dimensions: applicationname:WebServer01,environment:production

Hello\nWorld
Response Status Code
204 - No Content
Response Body
This request does not return a response body.
Log Transformer
Consumes logs from Kafka, transforms them, and publishes them back to Kafka.
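A minimal sketch of the transformer loop. The topic names and the envelope's field layout are assumptions, and the consumer/producer pair uses the third-party kafka-python client; the real transformer's logic is not specified here:

```python
import json

def transform(raw_message):
    """Hypothetical transformation step: parse the JSON envelope,
    normalize the message, and mark the envelope as processed
    (the 'log'/'message' field names are assumptions)."""
    envelope = json.loads(raw_message)
    log = envelope.get("log", {})
    log["message"] = log.get("message", "").strip()
    envelope["log"] = log
    envelope["processed"] = True
    return json.dumps(envelope)

def run(in_topic="raw-logs", out_topic="transformed-logs",
        brokers="localhost:9092"):
    """Consume, transform, and re-publish (topic names are assumptions)."""
    # Third-party kafka-python client; imported here so the pure
    # transform() above stays usable without it.
    from kafka import KafkaConsumer, KafkaProducer
    consumer = KafkaConsumer(in_topic, bootstrap_servers=brokers)
    producer = KafkaProducer(bootstrap_servers=brokers)
    for record in consumer:
        transformed = transform(record.value.decode("utf-8"))
        producer.send(out_topic, transformed.encode("utf-8"))
```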
Log Persister
Consumes logs from Kafka, prepares them for bulk storage, and stores them in Elasticsearch.
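The bulk-preparation step can be sketched with Elasticsearch's newline-delimited bulk format (an action line followed by a document line); the index name is hypothetical:

```python
import json

def to_bulk_actions(envelopes, index="monasca-logs"):
    """Build an Elasticsearch bulk-API request body: one action line
    plus one document line per log envelope, newline-terminated
    (index name is an assumption)."""
    lines = []
    for envelope in envelopes:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(envelope))
    return "\n".join(lines) + "\n"
```

The resulting body can then be sent to Elasticsearch's /_bulk endpoint by any HTTP client.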
Log Management Frontend
Monasca Log API
Visualization of logs.