HiveMQ Data Governance Hub Quick Start Guide

This quick start guide accompanies the Early Access Preview (EAP) version of the HiveMQ Data Governance Hub. Starting with HiveMQ 4.17, a free version of the EAP is included in your HiveMQ Platform Bundle along with a full-featured five-hour trial mode. To learn more about the different ways you can explore our new EAP, see HiveMQ Data Governance Hub Licensing.

The HiveMQ Data Governance Hub provides mechanisms to define how MQTT data is handled in the HiveMQ broker.

This guide gives you step-by-step instructions to do the following:

  • Install and configure the EAP version of the HiveMQ Data Governance Hub.

  • Add schemas to define the expected structure and format of incoming MQTT message payload data.

  • Add policies to tell your HiveMQ broker how you want your incoming MQTT messages to be handled.

  • Test your setup with valid and invalid incoming MQTT message payload data.

Requirements

To follow this guide, you need the following:

  • A HiveMQ installation, version 4.17 or later if you want to use the included free mode.

  • The curl and base64 command-line tools to create and upload schemas and policies.

  • The protoc compiler, only needed if you follow the Protobuf schema example.

  • The HiveMQ MQTT CLI (or another MQTT client) to publish test messages.

Installation

  1. Download and install HiveMQ.

  2. Place your valid license file (.plic) for the HiveMQ Data Governance Hub EAP in the license folder of your HiveMQ installation.
    (Skip this step if you are using the Data Governance Hub in Free Mode or Trial Mode).

Configuration

By default, the HiveMQ Data Governance Hub data validation feature is disabled (<enabled>false</enabled>).

To enable data validation, set the <enabled> option in the <data-validation> section of the <data-governance-hub> configuration to true.

To support data validation, HiveMQ REST API version 4.15 adds new Policies and Schemas endpoints. For more information, see HiveMQ REST API.

To enable the HiveMQ REST API, set the <enabled> option in the <rest-api> section of your HiveMQ configuration to true.

Example to enable the HiveMQ Data Governance Hub and HiveMQ REST API
<?xml version="1.0"?>
<hivemq>
    ...
    <!-- Enable the data validation feature -->
    <data-governance-hub>
        <data-validation>
            <enabled>true</enabled>
        </data-validation>
    </data-governance-hub>

    <!-- Enable the REST API so that we can create new schemas and policies through it -->
    <rest-api>
        <enabled>true</enabled>
    </rest-api>
    ...
</hivemq>
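
After you restart HiveMQ with this configuration, you can check that the REST API is reachable before you continue. The following one-liner is a minimal sketch; it assumes the REST API listens on the default http://localhost:8888 and that the schemas endpoint also accepts GET requests to list existing schemas (the list is empty on a fresh installation):

curl -X GET http://localhost:8888/api/v1/data-validation/schemas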

Add Schemas

The HiveMQ Data Governance Hub supports JSON Schema and Protobuf schema definitions.

Add a JSON Schema

  1. Create a JSON Schema called coordinates.json:

    coordinates.json
    {
      "$id": "https://example.com/geographical-location.schema.json",
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "title": "Longitude and Latitude Values",
      "description": "A geographical coordinate.",
      "required": [
        "latitude",
        "longitude"
      ],
      "type": "object",
      "properties": {
        "latitude": {
          "type": "number",
          "minimum": -90,
          "maximum": 90
        },
        "longitude": {
          "type": "number",
          "minimum": -180,
          "maximum": 180
        }
      }
    }
  2. Encode the JSON Schema file in base64 (encoding is required):

    cat coordinates.json | base64 > coordinates_base64.txt
    Resulting coordinates_base64.txt
    ewogICIkaWQiOiAiaHR0cHM6Ly9leGFtcGxlLmNvbS9nZW9ncmFwaGljYWwtbG9jYXRpb24uc2NoZW1hLmpzb24iLAogICIkc2NoZW1hIjogImh0dHBzOi8vanNvbi1zY2hlbWEub3JnL2RyYWZ0LzIwMjAtMTIvc2NoZW1hIiwKICAidGl0bGUiOiAiTG9uZ2l0dWRlIGFuZCBMYXRpdHVkZSBWYWx1ZXMiLAogICJkZXNjcmlwdGlvbiI6ICJBIGdlb2dyYXBoaWNhbCBjb29yZGluYXRlLiIsCiAgInJlcXVpcmVkIjogWwogICAgImxhdGl0dWRlIiwKICAgICJsb25naXR1ZGUiCiAgXSwKICAidHlwZSI6ICJvYmplY3QiLAogICJwcm9wZXJ0aWVzIjogewogICAgImxhdGl0dWRlIjogewogICAgICAidHlwZSI6ICJudW1iZXIiLAogICAgICAibWluaW11bSI6IC05MCwKICAgICAgIm1heGltdW0iOiA5MAogICAgfSwKICAgICJsb25naXR1ZGUiOiB7CiAgICAgICJ0eXBlIjogIm51bWJlciIsCiAgICAgICJtaW5pbXVtIjogLTE4MCwKICAgICAgIm1heGltdW0iOiAxODAKICAgIH0KICB9Cn0K
  3. In a file called coordinates-body.json, define the body of the create schema POST request that uploads the schema to your HiveMQ broker:

    Body for the create schema POST request
    {
      "id" : "gps_coordinates",
      "type" : "JSON",
      "schemaDefinition" : "ewogICIkaWQiOiAiaHR0cHM6Ly9leGFtcGxlLmNvbS9nZW9ncmFwaGljYWwtbG9jYXRpb24uc2NoZW1hLmpzb24iLAogICIkc2NoZW1hIjogImh0dHBzOi8vanNvbi1zY2hlbWEub3JnL2RyYWZ0LzIwMjAtMTIvc2NoZW1hIiwKICAidGl0bGUiOiAiTG9uZ2l0dWRlIGFuZCBMYXRpdHVkZSBWYWx1ZXMiLAogICJkZXNjcmlwdGlvbiI6ICJBIGdlb2dyYXBoaWNhbCBjb29yZGluYXRlLiIsCiAgInJlcXVpcmVkIjogWwogICAgImxhdGl0dWRlIiwKICAgICJsb25naXR1ZGUiCiAgXSwKICAidHlwZSI6ICJvYmplY3QiLAogICJwcm9wZXJ0aWVzIjogewogICAgImxhdGl0dWRlIjogewogICAgICAidHlwZSI6ICJudW1iZXIiLAogICAgICAibWluaW11bSI6IC05MCwKICAgICAgIm1heGltdW0iOiA5MAogICAgfSwKICAgICJsb25naXR1ZGUiOiB7CiAgICAgICJ0eXBlIjogIm51bWJlciIsCiAgICAgICJtaW5pbXVtIjogLTE4MCwKICAgICAgIm1heGltdW0iOiAxODAKICAgIH0KICB9Cn0K"
    }
    1. The id can be any identifier. You need it later to reference the schema in a policy.

    2. The type must be PROTOBUF or JSON.

    3. The schemaDefinition is the content of coordinates_base64.txt.

  4. To upload the schema to your broker with the coordinates-body.json request body, run the following command (an optional verification sketch follows after this procedure):

    curl  -H "Content-Type: application/json" -d "@coordinates-body.json" -X POST http://localhost:8888/api/v1/data-validation/schemas
    This example assumes that your HiveMQ REST API runs at http://localhost:8888.
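
To double-check that the schema was created, you can fetch it again by its id. This is an optional sketch; it assumes the REST API also exposes individual schemas at /schemas/{schemaId}, mirroring the delete-by-id pattern shown later for policies:

curl -X GET http://localhost:8888/api/v1/data-validation/schemas/gps_coordinates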

Add a Protobuf Schema

  1. Create a Protobuf schema called coordinates.proto:

    coordinates.proto
    syntax = "proto3";
    message GpsCoordinates {
      int32 longitude = 1;
      int32 latitude = 2;
    }
  2. Compile coordinates.proto to the descriptor set coordinates.desc:

    protoc --descriptor_set_out=coordinates.desc coordinates.proto
  3. Encode the descriptor file in base64 (encoding is required):

    cat coordinates.desc | base64 > coordinates_base64.txt
    Resulting coordinates_base64.txt
    CmcKEWNvb3JkaW5hdGVzLnByb3RvIkoKDkdwc0Nvb3JkaW5hdGVzEhwKCWxvbmdpdHVkZRgBIAEoBVIJbG9uZ2l0dWRlEhoKCGxhdGl0dWRlGAIgASgFUghsYXRpdHVkZWIGcHJvdG8z
  4. In a file called coordinates-body.json, define the body of the create schema POST request that uploads the schema:

    Body for the create schema POST request
    {
      "id" : "gps_coordinates",
      "type" : "PROTOBUF",
      "schemaDefinition" : "CmcKEWNvb3JkaW5hdGVzLnByb3RvIkoKDkdwc0Nvb3JkaW5hdGVzEhwKCWxvbmdpdHVkZRgBIAEoBVIJbG9uZ2l0dWRlEhoKCGxhdGl0dWRlGAIgASgFUghsYXRpdHVkZWIGcHJvdG8z",
      "arguments" : {
        "messageType" : "GpsCoordinates"
      }
    }
    1. The id can be any identifier. You need it later to reference the schema in a policy.

    2. The type must be PROTOBUF or JSON.

    3. The schemaDefinition is the content of coordinates_base64.txt.

    4. The messageType argument must be the name of the protobuf message from Step 1.

  5. To upload the schema to your broker, run the following command (a combined scripted variant is sketched after this procedure):

    curl  -H "Content-Type: application/json" -d "@coordinates-body.json" -X POST http://localhost:8888/api/v1/data-validation/schemas
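
If you prefer to script these steps, you can chain the compile, encode, and upload steps into a single pipeline. This is a sketch rather than part of the official procedure; it assumes protoc, base64, jq, and curl are installed and that the REST API runs at http://localhost:8888:

# jq is only used here to build the JSON request body; HiveMQ itself does not require it
protoc --descriptor_set_out=coordinates.desc coordinates.proto
base64 < coordinates.desc | tr -d '\n' \
  | jq -Rs '{id: "gps_coordinates", type: "PROTOBUF", schemaDefinition: ., arguments: {messageType: "GpsCoordinates"}}' \
  | curl -H "Content-Type: application/json" -d @- -X POST http://localhost:8888/api/v1/data-validation/schemas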

Add Policies

The policies you create tell your HiveMQ broker how you want your incoming MQTT messages to be handled.

Starting with HiveMQ version 4.17, a free version of the Data Governance Hub is included in your HiveMQ Platform bundle. In free mode, you can create one policy and use basic functionality. To create additional policies and explore all the capabilities of the HiveMQ Data Governance Hub, activate a 5-hour full trial or contact our sales team for licensing information.

Add a Policy

  1. Create a policy in a file called coordinates-policy.json:

    {
      "id": "com.hivemq.policy.coordinates",
      "matching": {
        "topicFilter": "coordinates/+"
      },
      "validation": {
        "validators": [
          {
            "type": "schema",
            "arguments": {
              "strategy": "ALL_OF",
              "schemas": [
                {
                  "schemaId": "gps_coordinates",
                  "version": "latest"
                }
              ]
            }
          }
        ]
      },
      "onSuccess": {
        "pipeline": [
          {
            "id": "logSuccess",
            "functionId": "System.log",
            "arguments": {
              "level": "INFO",
              "message": "${clientId} sent a valid publish on topic '${topic}' with result '${validationResult}'"
            }
          }
        ]
      },
      "onFailure": {
        "pipeline": [
          {
            "id": "logFailure",
            "functionId": "System.log",
            "arguments": {
              "level": "WARN",
              "message": "${clientId} sent an invalid publish on topic '${topic}' with result '${validationResult}'"
            }
          }
        ]
      }
    }
  2. Upload the policy to your HiveMQ broker with the following command (you must upload the referenced schema before you upload the policy):

    curl  -H "Content-Type: application/json" -d "@coordinates-policy.json" -X POST http://localhost:8888/api/v1/data-validation/policies

    The policy is now applied, and every incoming MQTT PUBLISH message on a topic that matches the coordinates/+ filter is validated against the gps_coordinates schema (an optional verification sketch follows after this step).
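
To confirm that the policy is present on the broker, you can list all policies. This is a minimal sketch; it assumes the policies endpoint also accepts GET requests, in line with the schemas endpoint used earlier:

curl -X GET http://localhost:8888/api/v1/data-validation/policies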

Test Your Data Validation Setup

Send an MQTT message with a valid payload

mqtt pub -h localhost -p 1883 -t coordinates/test -m "{\"longitude\":12, \"latitude\":42 }"
Example log output
10:27:33.449 INFO - hmq_GAht0_1_08087d5b1233e0bae4b6b10bfda40cc7 sent a valid publish on topic 'coordinates/test' with result 'ValidationResults:[{schemaId=gps_coordinates, success=true}]'

Send an MQTT message with an invalid payload

mqtt pub -h localhost -p 1883 -t coordinates/test -m "{\"longitude\":12, \"latitude\":1337 }"
Example log output
10:32:06.237 WARN - hmq_skuhj_1_04c848009887ef38f86c729269faae16 sent an invalid publish on topic 'coordinates/test' with result 'ValidationResults:[{schemaId=gps_coordinates, success=false, errors=[$.latitude: must have a maximum value of 90]}]'
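
Optionally, start a subscriber in a second terminal before you publish the test messages to observe which messages the broker actually forwards. The sketch below uses the same HiveMQ MQTT CLI as the publish examples above:

mqtt sub -h localhost -p 1883 -t 'coordinates/+'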

Delete a Policy

To delete the policy you just created, run the following command:

curl -X DELETE http://localhost:8888/api/v1/data-validation/policies/com.hivemq.policy.coordinates
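
If you also want to remove the schema, a DELETE request against the schemas endpoint with the schema id should work analogously. This is an assumption based on the endpoint structure used above, not a documented part of this guide; delete the policy that references the schema first:

curl -X DELETE http://localhost:8888/api/v1/data-validation/schemas/gps_coordinates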

Learn More

Once you set up your first schemas and policies, you are ready to dive deeper into all the capabilities the HiveMQ Data Governance Hub offers. To learn more, browse our full user documentation on Schemas and Policies or explore a collection of practical use cases in our HiveMQ Policy Cookbooks.