How-To: Set Up Distributed Monitoring
Use Case
Setting up a distributed monitoring configuration between a branch location and the central location of our example company.
Example Setup
The instructions are based on the example setup as described below.
Satellite SONARPLEX
This is a branch location in Cologne that is managed remotely.
| Parameter Name | Setting |
| --- | --- |
| Location | Cologne |
| IP-Address | 172.16.0.100 |
Central SONARPLEX
This is the network operations center (NOC) where all alerts from the remote locations are collected.
| Parameter Name | Setting |
| --- | --- |
| Location | London |
| IP-Address | 172.16.0.254 |
| azeti Agent Port | 4192 |
| azeti Agent Password | How-ToDM |
The satellite device should send its status every 30 minutes, and only HARD events should be delivered immediately.
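HARD and SOFT here refer to the usual Nagios-style state types (the passive check log format shown later suggests this is the model SONARPLEX follows): a failing check first produces SOFT alerts while retries are still pending and only becomes a HARD event once the retry attempts are exhausted, so suppressing SOFT alerts avoids flooding the NOC with transient problems. The minimal Python sketch below illustrates the distinction; the constant and function names are hypothetical and not part of any SONARPLEX API.

```python
# Illustration of Nagios-style SOFT/HARD state types; the constant and
# function names are hypothetical, not part of any SONARPLEX API.

MAX_CHECK_ATTEMPTS = 3  # failures tolerated before a problem state hardens

def state_type(consecutive_failures: int) -> str:
    """Classify a failing service check as SOFT or HARD."""
    if consecutive_failures >= MAX_CHECK_ATTEMPTS:
        return "HARD"  # retries exhausted; delivered to the NOC immediately
    return "SOFT"      # still retrying; covered by "Suppress SOFT State alerts"

for failures in (1, 2, 3):
    print(f"{failures} consecutive failure(s): {state_type(failures)}")
```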
The general procedure is as follows:
- Configure the Agent on the Central SONARPLEX
- Set the location identifier on the Satellite SONARPLEX
- Enable status and event delivery on the satellite
- Done.
Step-by-step guide
Central SONARPLEX configuration
First, set the azeti Agent credentials (a quick reachability check follows the list below).
- Open the Administration Web Interface > Configuration > Network > Agent Configuration
- Set Port to use to 4192
- Set Agent Password to How-ToDM (or any other password of your choice)
- Save the configuration
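Before moving on to the satellite, it can be useful to confirm that the agent port is actually reachable from the branch network. The following is a minimal sketch using a plain TCP connect test in Python; it is not an azeti tool, and the host and port values simply mirror the example setup above.

```python
#!/usr/bin/env python3
"""Plain TCP reachability test for the azeti Agent port (not an azeti tool)."""
import socket

HOST = "172.16.0.254"  # Central SONARPLEX (NOC), per the example setup
PORT = 4192            # azeti Agent port configured above

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"Agent port {PORT} on {HOST} is reachable.")
except OSError as exc:
    print(f"Cannot reach {HOST}:{PORT} - {exc}")
```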
Satellite SONARPLEX configuration
Now open the Administration Web Interface of the Satellite SONARPLEX and set the location identifier.
- Open the Administration Web Interface > Configuration > My Properties and scroll down to the bottom
- Open the resource LOCATION
- Set Resource Value: Cologne
- Save the configuration
Now enable the status and event delivery on the Satellite SONARPLEX.
- Open the Administration Web Interface > Configuration > Status Delivery Configuration for Distributed Monitoring and configure the settings as shown below.
| Parameter Name | Setting |
| --- | --- |
| Deliver Status at regular base | enable |
| Delivery Interval in Minutes (Heartbeat) | 30 |
| Retry Frequency in Minutes | 5 |
| Retry Attempts | 3 |
| Deliver following Events immediately | Service Alerts, Host Alerts, Suppress SOFT State alerts |
| Deliver Performance Data | enable |
| Do not store performance data locally | disable |
| Destination Host (NOC) | 172.16.0.254 |
| Destination Agent Port | 4192 |
| Destination Agent Password | How-ToDM |
| Default service output upon transfer problems | Missing check result from satellite. |
| Default service state upon transfer problems | UNKNOWN |
| Retain state on outdated services | enable |
| Debug Level | WARN |

Note that the Destination Agent Password must match the Agent Password set on the Central SONARPLEX (How-ToDM in this example).
See the Status Delivery Configuration for Distributed Monitoring article for a detailed explanation of the particular settings.
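To make the interplay of the heartbeat, retry frequency, and retry attempts concrete, here is a hedged Python sketch of the delivery timing these settings describe. It illustrates the semantics only and is not SONARPLEX's actual implementation; deliver_status() is a hypothetical placeholder.

```python
import time

HEARTBEAT_MINUTES = 30  # Delivery Interval in Minutes (Heartbeat)
RETRY_MINUTES = 5       # Retry Frequency in Minutes
RETRY_ATTEMPTS = 3      # Retry Attempts

def deliver_status() -> bool:
    """Hypothetical placeholder: push the status bundle to the NOC agent.

    A real implementation would talk to the azeti Agent on port 4192.
    """
    return True

def delivery_loop() -> None:
    while True:
        # One regular attempt plus up to RETRY_ATTEMPTS retries, 5 minutes apart.
        for attempt in range(1 + RETRY_ATTEMPTS):
            if deliver_status():
                break
            if attempt < RETRY_ATTEMPTS:
                time.sleep(RETRY_MINUTES * 60)
        # Wait for the next heartbeat regardless of the outcome.
        time.sleep(HEARTBEAT_MINUTES * 60)
```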
Verifying the Setup
All satellite objects will appear on the destination device after a couple of minutes, prefixed with the location identifier (see above). All incoming status information is submitted passively, so you should see messages similar to the excerpt below.
```
[2014-02-04 17:11:06] PASSIVE SERVICE CHECK: dm-test_-azeti-A-;dm-test_Software-Status;0;OK - FM-Usage: 0.22% - LOAD: 0.33
[2014-02-04 17:11:06] PASSIVE SERVICE CHECK: dm-test_-azeti-A-;dm-test_Monitoring;0;OK - 13 processes running, last status update 2 seconds ago
[2014-02-04 17:11:06] PASSIVE SERVICE CHECK: dm-test_-azeti-A-;dm-test_Watchdog-Status;0;OK - No problems within last recent hour
```
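The entries follow the usual `host;service;return_code;plugin output` layout of passive check results, with each object carrying the satellite's location identifier as a prefix (`dm-test_` in this excerpt). If you want to post-process such log lines, here is a small Python sketch assuming only the format visible above.

```python
import re

# Pattern derived from the excerpt above:
# [timestamp] PASSIVE SERVICE CHECK: host;service;return_code;output
LINE = re.compile(
    r"\[(?P<ts>[^\]]+)\] PASSIVE SERVICE CHECK: "
    r"(?P<host>[^;]+);(?P<service>[^;]+);(?P<code>\d+);(?P<output>.*)"
)

sample = ("[2014-02-04 17:11:06] PASSIVE SERVICE CHECK: "
          "dm-test_-azeti-A-;dm-test_Monitoring;0;"
          "OK - 13 processes running, last status update 2 seconds ago")

match = LINE.match(sample)
if match:
    # Objects arrive prefixed with the satellite's location identifier
    # ("dm-test" here), so splitting on the first "_" recovers it.
    location, _, service = match["service"].partition("_")
    print(location, service, match["code"], match["output"])
```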
Right after delivery is enabled, you can follow the internal distributed monitoring process in the following log files:
- Distributed Monitoring (NOC Processor) [process_satellite.log]
- Distributed Monitoring (Satellite Processor) [send_status.log]
- Distributed Monitoring Event Log (NOC) [event.log]
Please keep in mind that only the following service configuration objects will be synchronized in a distributed monitoring setup:
- Notification timeperiod (if a timeperiod of the same name exists on the NOC; 24x7 is used otherwise)
- Send notifications when service is in state
All other service objects (such as Notification frequency) are generated from default settings and can be configured differently on the NOC at any time.