
Monasca/Logging

This page documents the Monasca Logging solution that is in progress.



Note:
Currently we only have this page to discuss our proposals. For that reason some "metric related stuff" can be found here too. We will move or remove it whenever the discussion is finished.

Log Data Flow

The following diagram visualizes how logs are integrated into the Monasca processing pipeline. We have indicated some shortcuts we want to take as a first step, as well as some advanced functionality (multi-tenancy) that we plan for the future.

[Diagram: integration of logs into the Monasca processing pipeline]


Log Client

Monasca Agent

Collect and forward Log Data

The agent should be extended to collect log data and forward it to the Monasca Log API, by implementing or integrating a Logstash-like collector (e.g. Beaver). Beaver is a lightweight Python log file shipper that sends logs to an intermediate broker for further processing by Logstash.
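To make the shipper idea concrete, the following is a minimal sketch of the step a Beaver-like collector would perform for each log line: assembling the path, headers, and body for the Log API's single-log endpoint. The endpoint path and header names follow the API draft on this page; the function name and argument shapes are illustrative assumptions, not Monasca code.

```python
import json

def build_single_log_request(line, token, app_type=None, dimensions=None):
    """Return (path, headers, body) for shipping one log line to the Log API."""
    headers = {
        "Content-Type": "application/json",
        "X-Auth-Token": token,
    }
    if app_type:
        headers["X-Application-Type"] = app_type
    if dimensions:
        # Dimensions are sent as comma-separated key:value pairs.
        headers["X-Dimensions"] = ",".join(
            f"{k}:{v}" for k, v in sorted(dimensions.items()))
    body = json.dumps({"message": line})
    return "/v2.0/log/single", headers, body

path, headers, body = build_single_log_request(
    "Hello World!",
    "27feed73a0ce4138934e30d619b415b0",
    app_type="apache",
    dimensions={"applicationname": "WebServer01", "environment": "production"})
```

A real shipper would additionally tail the log file and POST each built request to the API host.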

Plugins

Logstash Output Plugin

https://github.com/FujitsuEnablingSoftwareTechnologyGmbH/logstash-output-monasca_api

Other Output Plugins

TODO e.g. for fluentd...

Log Management Backend

Integrate the ELK stack into the existing Monasca architecture to receive, authenticate, process, and store logs.

Monasca Log API

https://github.com/openstack/monasca-log-api

Name

Note:
Deviations between the Monasca Metric API documentation (https://github.com/stackforge/monasca-api/blob/master/docs/monasca-api-spec.md) and implementation (https://github.com/stackforge/monasca-api):

  • Document: Nothing is mentioned about the allowed characters in the name
  • Implementation: Characters in the name are restricted to a-z A-Z 0-9 _ . -

The implementation of the Log API follows the implementation of the Metric API!

Dimensions

Note:
Deviations between the Monasca Metric API documentation (https://github.com/stackforge/monasca-api/blob/master/docs/monasca-api-spec.md) and implementation (https://github.com/stackforge/monasca-api):

  • Document: The first character in the dimension is restricted to the following: a-z A-Z 0-9 _ / \ $. However, the next characters may be any character except for the following: ; } { = , & ) ( ".
  • Implementation: Characters in the dimension key are restricted to a-z A-Z 0-9 _ . -

The implementation of the Log API follows the implementation of the Metric API!

Request Line
  • POST /v2.0/log/single - Endpoint for single and multiline logs; maximum log size: 1MB
  • POST /v2.0/log/bulk - Endpoint for bulk logs (the logs must be line-separated); maximum log size: 5MB (TODO)

Request Headers
  • Content-Type (string, required) - application/json
  • X-Auth-Token (string, required) - Keystone authentication token
  • X-Application-Type (string, optional) - Type of application (as a hint on how to parse the logs)
  • X-Dimensions (string, optional) - A dictionary of (key, value) pairs to structure the logs and help later on with filtering (dashboard)
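A sketch of how the API could parse and validate the optional X-Dimensions header, assuming the comma-separated key:value encoding shown in the request examples below. The helper names are hypothetical; the key character rule follows the Metric API implementation (a-z A-Z 0-9 _ . -).

```python
import re

# Allowed characters for dimension keys, per the Metric API implementation.
VALID_NAME = re.compile(r"^[a-zA-Z0-9_.\-]+$")

def parse_dimensions(header_value):
    """Turn 'key:value,key:value' into a dict, validating the keys."""
    dimensions = {}
    for pair in header_value.split(","):
        key, sep, value = pair.partition(":")
        key, value = key.strip(), value.strip()
        if not sep:
            raise ValueError(f"malformed dimension: {pair!r}")
        if not VALID_NAME.match(key):
            raise ValueError(f"invalid dimension key: {key!r}")
        dimensions[key] = value
    return dimensions

dims = parse_dimensions("applicationname:WebServer01,environment:production")
```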


Timestamps:
The priority for determining the timestamp:
1) Try to parse timestamp from original log data
2) If that doesn't work:

  • RESTful API: Use receiving time as timestamp
  • future version: Syslog API: Take timestamp from syslog

Monasca operates in the UTC timezone, which means the timestamp is converted to the corresponding time in UTC.
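The priority above could be sketched as follows for the RESTful API case. The time format tried here is an illustrative assumption (real log data would need format detection per application type); the fallback is the receive time, and everything is normalized to UTC.

```python
from datetime import datetime, timezone

# Illustrative format list; a real parser would pick formats per
# application type (apache, syslog, ...).
LOG_TIME_FORMATS = ("%Y-%m-%dT%H:%M:%S%z",)

def resolve_timestamp(log_line, received_at):
    """Return a UTC timestamp for a log entry, per the stated priority."""
    # 1) Try to parse a timestamp from the original log data.
    for token in log_line.split():
        for fmt in LOG_TIME_FORMATS:
            try:
                return datetime.strptime(token, fmt).astimezone(timezone.utc)
            except ValueError:
                continue
    # 2) Fall back to the receiving time.
    return received_at.astimezone(timezone.utc)
```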

Request Body

The original (potentially unstructured) log data.

Request Examples

Single Log - JSON

POST /v2.0/log/single HTTP/1.1
Content-Type: application/json
X-Auth-Token: 27feed73a0ce4138934e30d619b415b0
X-Application-Type: apache
X-Dimensions: applicationname:WebServer01,environment:production

{"message":"Hello World!", "from":"hoover"}


Bulk of Logs - Plain Text (TODO)

POST /v2.0/log/bulk HTTP/1.1
Content-Type: text/plain
X-Auth-Token: 27feed73a0ce4138934e30d619b415b0
X-Application-Type: apache
X-Dimensions: applicationname:WebServer01,environment:production

Hello\nWorld

Response Status Code

204 - No Content

Response Body

This request does not return a response body.

Log Transformer

Consumes logs from Kafka, transforms them, and publishes to Kafka.
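A hypothetical sketch of the transform step only; the surrounding Kafka consume/produce loop is omitted, and the envelope field names are assumptions based on the API headers above, not a fixed Monasca schema.

```python
import json

def transform(raw_envelope_bytes):
    """Parse one raw log envelope consumed from Kafka and normalize it
    before publishing it back to Kafka for the persister."""
    envelope = json.loads(raw_envelope_bytes)
    log = envelope.get("log", {})
    return json.dumps({
        "application_type": envelope.get("application_type"),
        "dimensions": envelope.get("dimensions", {}),
        # Strip trailing newlines left over from line-based shipping.
        "message": log.get("message", "").rstrip("\n"),
    })

out = transform(b'{"application_type": "apache", "log": {"message": "Hello World!\\n"}}')
```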

Log Persister

Consumes logs from Kafka, prepares them for bulk storage, and stores them into Elasticsearch.
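The bulk-preparation step could look like the sketch below, which builds an Elasticsearch bulk request body (alternating action and source lines, newline-separated). The daily index naming scheme is an assumption, not a Monasca convention.

```python
import json

def to_bulk_body(logs, index_prefix="monasca", doc_type="logs"):
    """Turn a list of transformed log dicts into an Elasticsearch bulk body."""
    lines = []
    for log in logs:
        # Assumption: one index per day, derived from the log timestamp.
        index = f"{index_prefix}-{log['timestamp'][:10]}"
        lines.append(json.dumps({"index": {"_index": index, "_type": doc_type}}))
        lines.append(json.dumps(log))
    return "\n".join(lines) + "\n"  # bulk bodies must end with a newline

body = to_bulk_body([{"timestamp": "2015-06-01T10:00:00Z", "message": "Hello"}])
```

The resulting body would be POSTed to the Elasticsearch `_bulk` endpoint in batches.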

Log Management Frontend

Monasca Log API

Visualization of logs.

TODO





Off-Topic. Please ignore

Metric related and other Stuff

Tags

Introduce "tags" like in the proposal of the Log API. The tags will be needed for the container monitoring where the workload/applications are very dynamic and are not related to a static host anymore (e.g. kubernetes/docker cluster).

Monasca Agent

Additionally needed functionality.

The Monasca Agent should monitor the OpenStack Hypervisor (basic VM metrics like CPU, RAM only)

Sources:

  • Nova

Supported by agent?

  • KVM

Supported by agent?

  • VMware vSphere

vSphere is not supported by agent. A new plugin must be developed (blueprint...).

  • VMware ESXi

Probably not supported by agent?

User:

  • OpenStack administrator
  • OpenStack user (multi tenancy! Can agent add the tenant ID per VM?)

Monasca Ceilometer Plugin should forward Metrics

Data Flow: Hypervisor --> Nova --> Ceilometer --> Monasca-Ceilometer-Plugin --> Monasca

  • KVM

KVM sends metrics to Nova, but Ceilometer (currently?) doesn't poll that data. The Monasca-Ceilometer-Plugin is not production-ready, which means no metrics are forwarded to Monasca.

  • VMware vSphere

Although Nova provides support for VMware vCenter Server, Ceilometer does not: it does not collect metrics for virtual machines deployed on the VMware virtualization platform. It will be necessary to add this support to Ceilometer. The aim is to implement a polling agent that queries samples from VMware vCenter Server.
Ceilometer/blueprints/vmware-vcenter-server

OpenStack Hypervisor/VM Monitoring

The Monasca Agent should monitor each created VM (logs and metrics)
  • Install agent (automatically) on each created VM.

Metrics as well as logs can be forwarded to Monasca.

Windows Support

The Monasca agent should also run on Windows.

Note: The Monasca agent is based on DataDog. DataDog has supported Windows since November 9, 2012, but currently the Monasca agent runs on Ubuntu only.