Data Processing in Cisco Observability Platform



Process Vast Amounts of MELT Data

Cisco Observability Platform is designed to ingest and process vast amounts of MELT (Metrics, Events, Logs and Traces) data. It is built on top of open standards like OpenTelemetry to ensure interoperability.

What sets it apart is its provision of extensions, empowering our partners and customers to tailor every aspect of its functionality to their unique needs. Our focus today is unveiling the intricacies of customizations specifically tailored for data processing. It is expected that you have an understanding of the platform fundamentals, like the Flexible Metadata Model (FMM) and solution development. Let's dive in!

The data processing pipeline has various stages that lead to data storage. As MELT data moves through the pipeline, it is processed, transformed, and enriched, and ultimately lands in the data store where it can be queried with Unified Query Language (UQL):

Data observability

Each stage marked with a gear icon allows customization of specific logic. Furthermore, the platform enables the creation of entirely custom post-processing logic once data can no longer be altered.

To streamline customization while maintaining flexibility, we are embracing a new approach: workflows, taps, and plugins, using the CNCF Serverless Workflow specification with JSONata as the default expression language. Since Serverless Workflows are designed around open standards, we make extensive use of CloudEvents and OpenAPI specifications. By leveraging these open standards, we ensure compatibility and ease of development.

Data processing stages that allow data mutation are called taps, and their customizations plugins. Each tap declares an input and output JSON schema for its plugins. Plugins are expected to produce an output that adheres to the tap's output schema. A tap is responsible for merging the outputs from all of its plugins and producing a new event, which is a modified version of the original event. Taps can only be authored by the platform, while plugins can be created by any solution as well as regular users of the platform.

Workflows are meant for post-processing and thus can only subscribe to triggers (see below). Workflow use cases range from simple event counting to sophisticated machine learning model inferences. Anyone can author workflows.

This abstraction allows developers to reason in terms of a single event, without exposing the complexity of the underlying stream processing, and to use familiar, well-documented standards, both of which lower the barrier to entry.

Each data processing stage communicates with other stages via events, which allows us to decouple consumers and producers and seamlessly rearrange the stages should the need arise.
Each event has an associated category, which determines whether a particular stage can subscribe to or publish that event. There are two public categories for data-related events:

  • data:observation – a category of events with publish-only permissions that can be viewed as side effects of processing the original event, for example, an entity derived from resource attributes in an OpenTelemetry metric packet. Observations are indicated with upward ‘publish’ arrows in the diagram above. Taps, workflows and plugins can all produce observations. Observations can only be subscribed to by specific taps.
  • data:trigger – subscribe-only events that are emitted after all of the mutations have completed. Triggers are indicated with a lightning ‘trigger’ icon in the diagram above. Only workflows (post-processing logic) can subscribe to triggers, and only specific taps can publish them (see the illustrative event envelope after this list).
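
Since the platform builds on CloudEvents, each of these events travels as a CloudEvents envelope. The snippet below is only an illustration of the general shape of a trigger event – the id, source and payload values are made up, and the exact wire format may differ:

specversion: '1.0'
id: 5f9c2e1a-1111-2222-3333-444444444444      # made-up event id
type: platform:event.enriched.v1              # a data:trigger event
source: platform                              # illustrative source
datacontenttype: application/json
data:
  type: alerting:healthrule.violation
  timestamp: 1700000000000
  entities:
    - id: some-deployment-id                  # made-up entity reference
      type: k8s:deployment
# extension attributes (e.g. entitytypes) would appear as additional top-level attributes; omitted here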

There are five observation event types in the platform:

  • entity.observed – an FMM entity was discovered while processing some data. It can be a new entity or an update to an existing entity. Each update from the same source fully replaces the previous one.
  • association.observed – an FMM association was discovered while processing some data. The update logic differs depending on the cardinality of the association.
  • extension.observed – FMM extension attributes were discovered while processing some data. The target entity must already exist.
  • measurement.received – a measurement event that contributes to a specific FMM metric. These measurements will be aggregated into a metric in the Metric aggregation tap. The aggregation logic depends on the metric’s content type.
  • event.received – raises a new FMM event. This event will also be processed by the Event processing tap, just like externally ingested events.

There are three trigger event types in the platform, one for each data kind: metric.enriched, event.enriched, trace.enriched. All three events are emitted from the final ‘Tag enrichment’ tap.

Each event is registered in the platform’s knowledge store, so that they are easily discoverable. To list all available events, simply use fsoc to query them, e.g., to get all triggers:

fsoc knowledge get --type=contracts:cloudevent --filter="data.category eq 'data:trigger'" --layer-type=TENANT

Note that all event types are versioned to allow for evolution and are qualified with the platform solution identifier for isolation. For example, the fully qualified id of the measurement.received event is platform:measurement.received.v1.
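
Similarly, you can list all observation events by filtering on the data:observation category – this simply swaps the category in the query shown above:

fsoc knowledge get --type=contracts:cloudevent --filter="data.category eq 'data:observation'" --layer-type=TENANT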

Let’s illustrate the above concepts with a straightforward example. Consider a workflow designed to count health rule violations for Kubernetes workloads and APM services. The logic of the workflow can be broken down into a few steps:

  1. Subscribe to the trigger event
  2. Validate the event type and entity relevance
  3. Publish a measurement event counting violations while retaining severity

Development Tools

Developers can make use of various tools to assist in workflow development, such as web-based editors or IDEs.

It’s essential to ensure expressions and logic are valid through unit tests and validation against the defined schemas.

To help with that, you can write unit tests for your workflow; see an example for this workflow.
The online JSONata editor can also be a handy tool for writing your expressions.
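
For instance, you can paste a sample payload and a candidate expression into the editor (or a unit test) and check the result. A minimal sketch, using a made-up input shaped like the event schema shown later in this post:

Sample input:

{
  "data": {
    "type": "alerting:healthrule.violation",
    "entities": [ { "id": "abc", "type": "k8s:deployment" } ]
  }
}

Expression (a simplified version of the checkType function used below):

data.type = "alerting:healthrule.violation" and 'k8s:deployment' in data.entities.type

Expected result: true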

A blog post on workflow testing is coming soon!

Step by Step Guide

Create the workflow DSL

Provide a unique identifier and a name for your workflow:

id: violations-counter
version: '1.0.0'
specVersion: '0.8'
name: Violations Counter
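
Optionally, the Serverless Workflow specification also allows a human-readable description and an explicit start state. For example (an illustrative addition, not required for this workflow):

description: Counts health rule violations for Kubernetes workloads and APM services
start: event-received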

Find the trigger event

Let’s query our trigger using fsoc:

fsoc knowledge get --type=contracts:cloudevent --object-id=platform:event.enriched.v1 --layer-type=TENANT

Output:

type: event.enriched.v1
description: Indicates that an event was enriched with topology tags
dataschema: contracts:jsonSchema/platform:event.v1
category: data:trigger
extensions:
  - contracts:cloudeventExtension/platform:entitytypes
  - contracts:cloudeventExtension/platform:source


Subscribe to the event

To subscribe to this event, you need to add an event definition and an event state referencing that definition (note the nature of the reference to the event – it must be qualified with its knowledge type):

events:
  - name: EventReceived
    type: contracts:cloudevent/platform:event.enriched.v1
    kind: consumed
    dataOnly: false
    source: platform
states:
  - name: event-received
    type: event
    onEvents:
      - eventRefs:
          - EventReceived

Inspect the event

Since data in workflows is received in JSON format, event data is described in JSON Schema.

Let’s take a look at the JSON schema of this event (referenced in dataschema), so we know what to expect in our workflow:

fsoc knowledge get --type=contracts:jsonSchema --object-id=platform:event.v1 --layer-type=TENANT

Result:
$schema: http://json-schema.org/draft-07/schema#
title: Event
$id: event.v1
type: object
required:
  - entities
  - type
  - timestamp
properties:
  entities:
    type: array
    minItems: 1
    items:
      $ref: '#/definitions/EntityReference'
  type:
    $ref: '#/definitions/TypeReference'
  timestamp:
    type: integer
    description: The timestamp in milliseconds
  spanId:
    type: string
    description: Span id
  traceId:
    type: string
    description: Trace id
  raw:
    type: string
    description: The raw body of the event record
  attributes:
    $ref: '#/definitions/Attributes'
  tags:
    $ref: '#/definitions/Tags'
additionalProperties: false
definitions:
  Tags:
    type: object
    propertyNames:
      minLength: 1
      maxLength: 256
    additionalProperties:
      type: string
  Attributes:
    type: object
    propertyNames:
      minLength: 1
      maxLength: 256
    additionalProperties:
      type:
        - string
        - number
        - boolean
        - object
        - array
  EntityReference:
    type: object
    required:
      - id
      - type
    properties:
      id:
        type: string
      type:
        $ref: '#/definitions/TypeReference'
      additionalProperties: false
  TypeReference:
    type: string
    description: A fully qualified FMM type reference
    example: k8s:pod

It’s simple – a single event, with a list of entity references. Since dataOnly=false, the payload of the event will be enclosed in the data field, and the extension attributes will also be available to the workflow.
Since we know the specific FMM event type we’re interested in, you can also query its definition to understand the attributes that the workflow will be receiving and their semantics:

fsoc knowledge get --type=fmm:event --filter="data.name eq \"healthrule.violation\" and data.namespace.name eq \"alerting\"" --layer-type=TENANT
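
For orientation, a healthrule.violation event received by this workflow could look roughly like the following. This is only an illustrative sketch – the ids and values are made up, and the attribute names are the ones used later in this post (because dataOnly is false, the payload is wrapped in the data field):

data:
  type: alerting:healthrule.violation
  timestamp: 1700000000000
  entities:
    - id: some-deployment-id
      type: k8s:deployment
  attributes:
    violation_severity: WARNING
    event_details.condition_details.violation_count: 3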

Validate event relevance

You need to make sure that the event you receive is of the correct FMM event type, and that the referenced entities are relevant. To do that, you can write an expression in JSONata and then use it in an action condition:

functions:
  - name: checkType
    type: expression
    operation: |-
      data.type="alerting:healthrule.violation" and (
          'k8s:deployment' in data.entities.type or
          'k8s:statefulset' in data.entities.type or
          'k8s:daemonset' in data.entities.type or
          'k8s:cronjob' in data.entities.type or
          'k8s:managed_job' in data.entities.type or
          'apm:service' in data.entities.type
      )
states:
  - name: event-received
    type: event
    onEvents:
      - eventRefs:
          - EventReceived
        actions:
          - name: createMeasurement
            condition: ${ fn:checkType }

Create and publish an event

Let’s explore the measurement observation event that we can publish:

fsoc knowledge get --type=contracts:cloudevent --object-id=platform:measurement.received.v1 --layer-type=TENANT

Output:

type: measurement.received.v1
description: Indicates that measurements were received. Measurements are then aggregated into a metric.
dataschema: contracts:jsonSchema/platform:measurement.v1
category: data:observation
extensions:
  - contracts:cloudeventExtension/platform:source

Now let’s take a look at the measurement schema so we know how to produce a measurement event:

fsoc knowledge get --type=contracts:jsonSchema --object-id=platform:measurement.v1 --layer-type=TENANT

Output:

$schema: http://json-schema.org/draft-07/schema#
title: Measurements for a specific metric
$id: measurement.v1
type: object
required:
  - entity
  - type
  - measurements
properties:
  entity:
    $ref: '#/definitions/EntityReference'
  type:
    $ref: '#/definitions/TypeReference'
  attributes:
    $ref: '#/definitions/Attributes'
  measurements:
    type: array
    minItems: 1
    description: Measurement values with timestamp to be used for metric computation
    items:
      type: object
      required:
        - timestamp
      oneOf:
        - required:
            - intValue
        - required:
            - doubleValue
      properties:
        timestamp:
          type: integer
          description: The timestamp in milliseconds
        intValue:
          type: integer
          description: Long value to be used for metric computation.
        doubleValue:
          type: number
          description: Double measurement value to be used for metric computation.
      additionalProperties: false
additionalProperties: false
definitions:
  Attributes:
    type: object
    propertyNames:
      minLength: 1
      maxLength: 256
    additionalProperties:
      type:
        - string
        - number
        - boolean
  EntityReference:
    type: object
    required:
      - id
      - type
    properties:
      id:
        type: string
      type:
        $ref: '#/definitions/TypeReference'
      additionalProperties: false
  TypeReference:
    type: string
    description: A fully qualified FMM type name
    example: k8s:pod
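
To make the schema concrete, here is a hand-written example of a measurement payload that would validate against it (the entity id, metric type, and values are illustrative and mirror the expression built in the next step):

entity:
  id: some-deployment-id
  type: k8s:deployment
type: sampleworkflow:healthrule.violation.count
attributes:
  violation_severity: WARNING
measurements:
  - timestamp: 1700000000000
    intValue: 3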

Create a measurement

Let’s create another expression that takes the input event and generates a measurement per the above schema, and use it in an action in the event state:

functions:
  ...
  - name: createMeasurement
    type: expression
    operation: |-
      {
          'entity': data.entities[0],
          'type': 'sampleworkflow:healthrule.violation.count',
          'attributes': {
              'violation_severity': data.attributes.violation_severity
          },
          'measurements': [
              {
                  'timestamp': data.timestamp,
                  'intValue': $exists(data.attributes.'event_details.condition_details.violation_count') ?
                      data.attributes.'event_details.condition_details.violation_count' : 1
              }
          ]
      }
states:
  - name: event-received
    type: event
    onEvents:
      - eventRefs:
          - EventReceived
        actions:
          - name: createMeasurement
            condition: '${ fn:checkType }'
            functionRef: createMeasurement
            actionDataFilter:
              toStateData: '${ measurement }'

Here we are preserving the violation_severity attribute from the original event and associating the measurement with the first entity referenced by that event.

The state execution will result in a measurement field created by the createMeasurement action, but only if the event was interesting based on the condition.

Note that since we are using a new FMM metric type – sampleworkflow:healthrule.violation.count – we need to register it via an extension on the target entity types. See the full solution linked below for details.
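
Once the solution containing that extension is deployed, you can double-check that the metric type is registered by querying the knowledge store, following the same pattern as the fmm:event query above (assuming the metric is named exactly as in this example):

fsoc knowledge get --type=fmm:metric --filter="data.name eq \"healthrule.violation.count\" and data.namespace.name eq \"sampleworkflow\"" --layer-type=TENANT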

Publish an event

The next step is to check whether the measurement was indeed created, and to produce an event if it was. To do that, we’ll use a switch state:

states:
  - name: event-received
    type: event
    onEvents:
      - eventRefs:
          - EventReceived
        actions:
          - name: createMeasurement
            condition: ${ fn:checkType }
            functionRef:
              refName: createMeasurement
            actionDataFilter:
              toStateData: ${ measurement }
    transition: check-measurement
  - name: check-measurement
    type: switch
    dataConditions:
      - condition: ${ measurement != null }
        end:
          terminate: true
          produceEvents:
            - eventRef: CreateMeasurement
              data: ${ measurement }
    defaultCondition:
      end: true
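
The produceEvents clause refers to a produced event definition named CreateMeasurement, which must be declared in the events section alongside the consumed one. A sketch of what that definition might look like, assuming it follows the same qualification pattern as the consumed event (the full solution linked below has the exact definition):

events:
  ...
  - name: CreateMeasurement
    type: contracts:cloudevent/platform:measurement.received.v1
    kind: produced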

That’s it! You can package your workflow in a solution, push the solution, subscribe to it, and view the metrics by navigating to the metric explorer at https://<your tenant>.observe.appdynamics.com/discover/cco/metric-explorer

An example graph sliced by violation_severity
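
If you prefer the command line, you can also pull the new metric with a UQL query, for example via fsoc’s UQL support (a hypothetical query – the metric and entity types are the ones used in this example, and it assumes your fsoc version includes the uql command):

fsoc uql "FETCH metrics(sampleworkflow:healthrule.violation.count) FROM entities(k8s:deployment)"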


In conclusion, the extensibility of the Cisco Observability Platform empowers developers to tailor data processing to their specific requirements efficiently. Whether it’s customizing processing logic or implementing complex workflows, the platform provides the necessary tools and flexibility.

Ready to learn more? Visit the examples repo to explore further and start customizing your data processing workflows today.
