Mirror of https://github.com/drewkerrigan/nagios-http-json.git, synced 2025-05-10 09:53:44 +02:00

Compare commits (19 commits)
Commits (SHA1):

* ccf05d469a
* 115acc06fd
* 164632faa5
* 039cb0adb6
* 4e0d4e873b
* 186f081cd7
* e15f0f01ed
* b61789e4a4
* c6daa09ba2
* 3a1e7d90d0
* afb2ef7b88
* 2a6d88bc39
* 2dbb38512f
* 9ff11308be
* c634ae8bb5
* d3a2f3ed9e
* 9d344f5a7a
* 5c4a955abd
* b920a65afd
.github/workflows/unittest.yml (vendored, 9 changed lines)

@@ -1,6 +1,11 @@
name: CI

on: [push, pull_request]
on:
  push:
    branches: [main, master]
    tags:
      - v*
  pull_request:

jobs:
  gitHubActionForPytest:
@@ -11,7 +16,7 @@ jobs:
    name: GitHub Action
    steps:
      - name: Checkout
        uses: actions/checkout@v2
        uses: actions/checkout@v4
      - name: Install dependencies
        run: |
          python -m pip install -r requirements-dev.txt
README.md (166 changed lines)

@@ -4,15 +4,50 @@

This is a generic plugin for Nagios which checks JSON values from a given HTTP endpoint against argument-specified rules and determines the status and performance data for that service.

## Links
## Installation

* [CLI Usage](#cli-usage)
* [Examples](#examples)
* [Riak Stats](docs/RIAK.md)
* [Docker](docs/DOCKER.md)
* [Nagios Installation](#nagios-installation)

Requirements:

## CLI Usage

* Python 3.6+

### Nagios

Assuming a standard installation of Nagios, the plugin can be executed from the machine that Nagios is running on.

```bash
cp check_http_json.py /usr/local/nagios/libexec/plugins/check_http_json.py
chmod +x /usr/local/nagios/libexec/plugins/check_http_json.py
```

Add the following service definition to your server config (`localhost.cfg`):

```
define service {
    use                 local-service
    host_name           localhost
    service_description <command_description>
    check_command       <command_name>
}
```

Add the following command definition to your commands config (`commands.config`):

```
define command {
    command_name <command_name>
    command_line /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H <host>:<port> -p <path> [-e|-q|-w|-c <rules>] [-m <metrics>]
}
```

### Icinga2

An example Icinga2 command definition can be found here: `contrib/icinga2_check_command_definition.conf`

## Usage

Executing `./check_http_json.py -h` will yield the following details:

@@ -58,7 +93,9 @@ options:
  -t TIMEOUT, --timeout TIMEOUT
                        Connection timeout (seconds)
  --unreachable-state UNREACHABLE_STATE
                        Exit with specified code if URL unreachable. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)
                        Exit with specified code when the URL is unreachable. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)
  --invalid-json-state INVALID_JSON_STATE
                        Exit with specified code when no valid JSON is returned. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)
  -B AUTH, --basic-auth AUTH
                        Basic auth string "username:password"
  -D DATA, --data DATA  The http payload to send as a POST
@@ -83,6 +120,18 @@ options:
                        can be delimited with colon (key,value1:value2). Return warning if equality check fails
  -Q [KEY_VALUE_LIST_CRITICAL ...], --key_equals_critical [KEY_VALUE_LIST_CRITICAL ...]
                        Same as -q but return critical if equality check fails.
  --key_time [KEY_TIME_LIST ...]
                        Checks the timestamp of these keys and values
                        (key[>alias],value key2,value2) to determine status.
                        Multiple key values can be delimited with colon
                        (key,value1:value2). Return warning if the key is older
                        than the value (ex.: 30s,10m,2h,3d,...).
                        With a leading @ it returns warning if the key is younger
                        than the value (ex.: @30s,@10m,@2h,@3d,...).
                        With a leading minus the threshold is shifted into the future.
  --key_time_critical [KEY_TIME_LIST_CRITICAL ...]
                        Same as --key_time but return critical if
                        the timestamp check fails.
  -u [KEY_VALUE_LIST_UNKNOWN ...], --key_equals_unknown [KEY_VALUE_LIST_UNKNOWN ...]
                        Same as -q but return unknown if equality check fails.
  -y [KEY_VALUE_LIST_NOT ...], --key_not_equals [KEY_VALUE_LIST_NOT ...]
@@ -97,6 +146,8 @@ options:
                        (key[>alias],UnitOfMeasure), (key[>alias],UnitOfMeasure,WarnRange, CriticalRange).
```

The check plugin respects the environment variables `HTTP_PROXY` and `HTTPS_PROXY`.

## Examples

### Key Naming

@@ -160,6 +211,22 @@
    ]
}

**Data for multiple keys for an object** `-q capacity1.value,True capacity2.value,True capacity3.value,True`

    {
        "capacity1": {
            "value": true
        },
        "capacity2": {
            "value": true
        },
        "capacity3": {
            "value": true
        }
    }

### Thresholds and Ranges

**Data**:

@@ -188,54 +255,49 @@

More info about Nagios Range format and Units of Measure can be found at [https://nagios-plugins.org/doc/guidelines.html](https://nagios-plugins.org/doc/guidelines.html).
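For reference, the sketch below shows one way to evaluate a Nagios range such as `0:20`, `20` (shorthand for `0:20`) or `@0:20` (inverted) against a value, following the guidelines linked above. It is an illustration with invented names, not the plugin's own implementation.

```python
def outside_range(value, spec):
    """Return True if `value` should alert for Nagios range `spec` ("[@]start:end").

    Per the plugin guidelines: "N" is shorthand for "0:N", "start:" has no upper
    bound, "~" as the start means no lower bound, and a leading "@" inverts the
    check (alert when the value is inside the range).
    """
    invert = spec.startswith('@')
    if invert:
        spec = spec[1:]

    start_s, _, end_s = spec.partition(':') if ':' in spec else ('0', ':', spec)
    start = float('-inf') if start_s == '~' else float(start_s or 0)
    end = float('inf') if end_s == '' else float(end_s)

    outside = value < start or value > end
    return not outside if invert else outside


# A value of 30 checked against "20" is outside the range 0:20,
# so a rule like -w "age,20" would yield WARNING.
print(outside_range(30, '20'))    # True
print(outside_range(30, '0:50'))  # False
```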

### Timestamp

**Data**:

    { "metric": "2020-01-01 10:10:00.000000+00:00" }

#### Relevant Commands

* **Warning:** `./check_http_json.py -H <host>:<port> -p <path> --key_time "metric,TIME"`
* **Critical:** `./check_http_json.py -H <host>:<port> -p <path> --key_time_critical "metric,TIME"`

#### TIME Definitions

* **Format:** `[@][-]TIME`
* **Generates a Warning or Critical if...**
  * **Timestamp is more than 30 seconds in the past:** `30s`
  * **Timestamp is more than 5 minutes in the past:** `5m`
  * **Timestamp is more than 12 hours in the past:** `12h`
  * **Timestamp is more than 2 days in the past:** `2d`
  * **Timestamp is more than 30 minutes in the future:** `-30m`
  * **Timestamp is not more than 30 minutes in the future:** `@-30m`
  * **Timestamp is not more than 30 minutes in the past:** `@30m`

##### Timestamp Format

This plugin uses the Python function `datetime.fromisoformat`.
Since Python 3.11, most valid ISO 8601 formats are supported, with the following notes and exceptions:

* Time zone offsets may have fractional seconds.
* The T separator may be replaced by any single unicode character.
* Fractional hours and minutes are not supported.
* Reduced precision dates are not currently supported (YYYY-MM, YYYY).
* Extended date representations are not currently supported (±YYYYYY-MM-DD).
* Ordinal dates are not currently supported (YYYY-OOO).

Before Python 3.11, this method only supported formats that could be emitted by `datetime.isoformat()`, such as YYYY-MM-DD.

More info and examples about the timestamp format can be found at [https://docs.python.org/3/library/datetime.html#datetime.datetime.fromisoformat](https://docs.python.org/3/library/datetime.html#datetime.datetime.fromisoformat).
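To make the `[@][-]TIME` semantics above concrete, here is a minimal standalone sketch of the check this section describes: parse the value with `datetime.fromisoformat`, turn the TIME suffix into a `timedelta`, and compare the age against it. The function and variable names are illustrative and not taken from the plugin.

```python
from datetime import datetime, timedelta, timezone

UNITS = {'s': 'seconds', 'm': 'minutes', 'h': 'hours', 'd': 'days'}


def timestamp_violates(value, spec):
    """Does ISO timestamp `value` violate the TIME `spec` (e.g. "30m", "@30m", "-2h", "@-2h")?

    Without "@": alert when the timestamp is older than now minus the duration
    (or, with a leading "-", more than the duration in the future).
    With "@": the test is inverted.
    """
    invert = spec.startswith('@')
    if invert:
        spec = spec[1:]
    future = spec.startswith('-')
    if future:
        spec = spec[1:]

    duration = timedelta(**{UNITS[spec[-1]]: int(spec[:-1])})
    timestamp = datetime.fromisoformat(value)
    if timestamp.tzinfo is None:
        timestamp = timestamp.replace(tzinfo=timezone.utc)

    age = datetime.now(timezone.utc) - timestamp  # positive if in the past
    outside = (age < -duration) if future else (age > duration)
    return not outside if invert else outside


# The metric from the Data example above is far older than 30 minutes,
# so --key_time "metric,30m" would raise a WARNING:
print(timestamp_violates("2020-01-01 10:10:00.000000+00:00", "30m"))  # True
```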

#### Using Headers

* `./check_http_json.py -H <host>:<port> -p <path> -A '{"content-type": "application/json"}' -w "metric,RANGE"`

## Nagios Installation

### Requirements

* Python 3.6+

### Configuration

Assuming a standard installation of Nagios, the plugin can be executed from the machine that Nagios is running on.

```bash
cp check_http_json.py /usr/local/nagios/libexec/plugins/check_http_json.py
chmod +x /usr/local/nagios/libexec/plugins/check_http_json.py
```

Add the following service definition to your server config (`localhost.cfg`):

```
define service {
    use                 local-service
    host_name           localhost
    service_description <command_description>
    check_command       <command_name>
}
```

Add the following command definition to your commands config (`commands.config`):

```
define command {
    command_name <command_name>
    command_line /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H <host>:<port> -p <path> [-e|-q|-w|-c <rules>] [-m <metrics>]
}
```

## Icinga2 configuration

The Icinga2 command definition can be found here: `contrib/icinga2_check_command_definition.conf`

## License

Copyright 2014-2015 Drew Kerrigan.
check_http_json.py

@@ -6,8 +6,10 @@ import json
import argparse
import sys
import ssl
import traceback
from urllib.error import HTTPError
from urllib.error import URLError
from datetime import datetime, timedelta, timezone

plugin_description = \
"""

@@ -23,8 +25,8 @@ WARNING_CODE = 1
CRITICAL_CODE = 2
UNKNOWN_CODE = 3

__version__ = '2.2.0'
__version_date__ = '2024-05-14'
__version__ = '2.3.0'
__version_date__ = '2025-04-11'


class NagiosHelper:
    """

@@ -234,11 +236,13 @@ class JsonRuleProcessor:
        self.key_value_list = self.expandKeys(self.rules.key_value_list)
        self.key_value_list_not = self.expandKeys(
            self.rules.key_value_list_not)
        self.key_time_list = self.expandKeys(self.rules.key_time_list)
        self.key_list = self.expandKeys(self.rules.key_list)
        self.key_value_list_critical = self.expandKeys(
            self.rules.key_value_list_critical)
        self.key_value_list_not_critical = self.expandKeys(
            self.rules.key_value_list_not_critical)
        self.key_time_list_critical = self.expandKeys(self.rules.key_time_list_critical)
        self.key_list_critical = self.expandKeys(self.rules.key_list_critical)
        self.key_value_list_unknown = self.expandKeys(
            self.rules.key_value_list_unknown)

@@ -330,6 +334,72 @@ class JsonRuleProcessor:
            failure += self.checkThreshold(key, alias, r)
        return failure

    def checkTimestamp(self, key, alias, r):
        failure = ''
        invert = False
        negative = False
        if r.startswith('@'):
            invert = True
            r = r[1:]
        if r.startswith('-'):
            negative = True
            r = r[1:]
        duration = int(r[:-1])
        unit = r[-1]

        if unit == 's':
            timeduration = timedelta(seconds=duration)
        elif unit == 'm':
            timeduration = timedelta(minutes=duration)
        elif unit == 'h':
            timeduration = timedelta(hours=duration)
        elif unit == 'd':
            timeduration = timedelta(days=duration)
        else:
            return " Value (%s) is not a valid time duration." % (r)

        if not self.helper.exists(key):
            return " Key (%s) for key %s does not exist." % \
                (key, alias)

        try:
            timestamp = datetime.fromisoformat(self.helper.get(key))
        except ValueError as ve:
            return " Value (%s) for key %s is not a date in ISO format. %s" % \
                (self.helper.get(key), alias, ve)

        now = datetime.now(timezone.utc)

        if timestamp.tzinfo is None:
            timestamp = timestamp.replace(tzinfo=timezone.utc)

        age = now - timestamp

        if not negative:
            if age > timeduration and not invert:
                failure += " Value (%s) for key %s is older than now-%s%s." % \
                    (self.helper.get(key), alias, duration, unit)
            if not age > timeduration and invert:
                failure += " Value (%s) for key %s is newer than now-%s%s." % \
                    (self.helper.get(key), alias, duration, unit)
        else:
            if age < -timeduration and not invert:
                failure += " Value (%s) for key %s is newer than now+%s%s." % \
                    (self.helper.get(key), alias, duration, unit)
            if not age < -timeduration and invert:
                failure += " Value (%s) for key %s is older than now+%s%s." % \
                    (self.helper.get(key), alias, duration, unit)

        return failure

    def checkTimestamps(self, threshold_list):
        failure = ''
        for threshold in threshold_list:
            k, r = threshold.split(',')
            key, alias = _getKeyAlias(k)
            failure += self.checkTimestamp(key, alias, r)
        return failure

    def checkWarning(self):
        failure = ''
        if self.key_threshold_warning is not None:

@@ -338,6 +408,8 @@ class JsonRuleProcessor:
            failure += self.checkEquality(self.key_value_list)
        if self.key_value_list_not is not None:
            failure += self.checkNonEquality(self.key_value_list_not)
        if self.key_time_list is not None:
            failure += self.checkTimestamps(self.key_time_list)
        if self.key_list is not None:
            failure += self.checkExists(self.key_list)
        return failure

@@ -352,6 +424,8 @@ class JsonRuleProcessor:
            failure += self.checkEquality(self.key_value_list_critical)
        if self.key_value_list_not_critical is not None:
            failure += self.checkNonEquality(self.key_value_list_not_critical)
        if self.key_time_list_critical is not None:
            failure += self.checkTimestamps(self.key_time_list_critical)
        if self.key_list_critical is not None:
            failure += self.checkExists(self.key_list_critical)
        return failure

@@ -446,7 +520,9 @@ def parseArgs(args):
    parser.add_argument('-t', '--timeout', type=int,
                        help='Connection timeout (seconds)')
    parser.add_argument('--unreachable-state', type=int, default=3,
                        help='Exit with specified code if URL unreachable. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)')
                        help='Exit with specified code when the URL is unreachable. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)')
    parser.add_argument('--invalid-json-state', type=int, default=3,
                        help='Exit with specified code when no valid JSON is returned. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)')
    parser.add_argument('-B', '--basic-auth', dest='auth',
                        help='Basic auth string "username:password"')
    parser.add_argument('-D', '--data', dest='data',

@@ -490,6 +566,19 @@ def parseArgs(args):
                        dest='key_value_list_critical', nargs='*',
                        help='''Same as -q but return critical if
                        equality check fails.''')
    parser.add_argument('--key_time', dest='key_time_list', nargs='*',
                        help='''Checks the timestamp of these keys and values
                        (key[>alias],value key2,value2) to determine status.
                        Multiple key values can be delimited with colon
                        (key,value1:value2). Return warning if the key is older
                        than the value (ex.: 30s,10m,2h,3d,...).
                        With a leading @ it returns warning if the key is younger
                        than the value (ex.: @30s,@10m,@2h,@3d,...).
                        With a leading minus the threshold is shifted into the future.''')
    parser.add_argument('--key_time_critical',
                        dest='key_time_list_critical', nargs='*',
                        help='''Same as --key_time but return critical if
                        the timestamp check fails.''')
    parser.add_argument('-u', '--key_equals_unknown',
                        dest='key_value_list_unknown', nargs='*',
                        help='''Same as -q but return unknown if

@@ -634,13 +723,15 @@ def main(cliargs):
    json_data = ''

    try:
        # Requesting the data from the URL
        json_data = make_request(args, url, context)
    except HTTPError as e:
        # Try to recover from HTTP Error, if there is JSON in the response
        if "json" in e.info().get_content_subtype():
            json_data = e.read()
        else:
            nagios.append_message(UNKNOWN_CODE, " Could not find JSON in HTTP body. HTTPError[%s], url:%s" % (str(e.code), url))
            exit_code = args.invalid_json_state
            nagios.append_message(exit_code, " Could not find JSON in HTTP body. HTTPError[%s], url:%s" % (str(e.code), url))
    except URLError as e:
        # Some users might prefer another exit code if the URL wasn't reached
        exit_code = args.unreachable_state

@@ -650,23 +741,32 @@
        sys.exit(nagios.getCode())

    try:
        # Loading the JSON data from the request
        data = json.loads(json_data)
    except ValueError as e:
        nagios.append_message(UNKNOWN_CODE, " JSON Parser error: %s" % str(e))
        exit_code = args.invalid_json_state
        debugPrint(args.debug, traceback.format_exc())
        nagios.append_message(exit_code, " JSON Parser error: %s" % str(e))
        print(nagios.getMessage())
        sys.exit(nagios.getCode())
    else:
        verbosePrint(args.verbose, 1, json.dumps(data, indent=2))
        # Apply rules to returned JSON data

    try:
        # Applying rules to returned JSON data
        processor = JsonRuleProcessor(data, args)
        nagios.append_message(WARNING_CODE, processor.checkWarning())
        nagios.append_message(CRITICAL_CODE, processor.checkCritical())
        nagios.append_metrics(processor.checkMetrics())
        nagios.append_message(UNKNOWN_CODE, processor.checkUnknown())
    except Exception as e:  # pylint: disable=broad-exception-caught
        debugPrint(args.debug, traceback.format_exc())
        nagios.append_message(UNKNOWN_CODE, " Rule Parser error: %s" % str(e))

    # Print Nagios specific string and exit appropriately
    print(nagios.getMessage())
    sys.exit(nagios.getCode())


if __name__ == "__main__":
    # Program entry point
    main(sys.argv[1:])
contrib/icinga2_check_command_definition.conf

@@ -1,4 +1,6 @@
object CheckCommand "http_json" {
  // Example configuration for Icinga

  import "plugin-check-command"

  command = [ PluginDir + "/check_http_json.py" ]

@@ -53,6 +55,13 @@ object CheckCommand "http_json" {
      value = "$http_json_headers$"
      description = "additional http headers in JSON format to send with the request"
    }
    "--unreachable-state" = {
      value = "$http_json_unreachable_state$"
      description = "Exit with specified code when the URL is unreachable."
    }
    "--invalid-json-state" = {
      value = "$http_json_invalid_json_state$"
      description = "Exit with specified code when no valid JSON is returned."
    }
    "--field_separator" = {
      value = "$http_json_field_separator$"
      description = "JSON Field separator, defaults to '.'; Select element in an array with '(' ')'"

@@ -64,42 +73,62 @@ object CheckCommand "http_json" {
    "--warning" = {
      value = "$http_json_warning$"
      description = "Warning threshold for these values, WarningRange is in the format [@]start:end"
      repeat_key = true
    }
    "--critical" = {
      value = "$http_json_critical$"
      description = "Critical threshold for these values, CriticalRange is in the format [@]start:end"
      repeat_key = true
    }
    "--key_exists" = {
      value = "$http_json_key_exists$"
      description = "Checks existence of these keys to determine status. Return warning if key is not present."
      repeat_key = true
    }
    "--key_exists_critical" = {
      value = "$http_json_key_exists_critical$"
      description = "Checks existence of these keys to determine status. Return critical if key is not present."
      repeat_key = true
    }
    "--key_equals" = {
      value = "$http_json_key_equals$"
      description = "Checks equality of these keys and values. Return warning if equality check fails"
      repeat_key = true
    }
    "--key_equals_critical" = {
      value = "$http_json_key_equals_critical$"
      description = "Checks equality of these keys and values. Return critical if equality check fails"
      repeat_key = true
    }
    "--key_equals_unknown" = {
      value = "$http_json_key_equals_unknown$"
      description = "Checks equality of these keys and values. Return unknown if equality check fails"
      repeat_key = true
    }
    "--key_not_equals" = {
      value = "$http_json_key_not_equals$"
      description = "Checks equality of these keys and values (key[>alias],value key2,value2) to determine status. Multiple key values can be delimited with colon (key,value1:value2). Return warning if equality check succeeds."
      repeat_key = true
    }
    "--key_not_equals_critical" = {
      value = "$http_json_key_not_equals_critical$"
      description = "Checks equality of these keys and values (key[>alias],value key2,value2) to determine status. Multiple key values can be delimited with colon (key,value1:value2). Return critical if equality check succeeds."
      repeat_key = true
    }
    "--key_metric" = {
      value = "$http_json_key_metric$"
      description = "Gathers the values of these keys"
      repeat_key = true
    }
    "--key_time" = {
      value = "$http_json_key_time$"
      description = "Checks the timestamp of these keys and values (key[>alias],value key2,value2) to determine status."
      repeat_key = true
    }
    "--key_time_critical" = {
      value = "$http_json_key_time_critical$"
      description = "Same as --key_time but return critical if the timestamp check fails."
      repeat_key = true
    }
  }
}
|
||||
coverage==6.5.0
|
||||
pylint==2.17.7
|
||||
coverage==7.8.0
|
||||
pylint==3.3.6
|
||||
|
Unit tests (test file for check_http_json.py)

@@ -28,6 +28,8 @@ class RulesHelper:
    key_threshold_critical = None
    key_value_list_critical = None
    key_value_list_not_critical = None
    key_time_list = None
    key_time_list_critical = None
    key_value_list_unknown = None
    key_list_critical = None
    metric_list = None

@@ -72,6 +74,13 @@ class RulesHelper:
        self.key_threshold_critical = data
        return self

    def dash_dash_key_time(self, data):
        self.key_time_list = data
        return self

    def dash_dash_key_time_critical(self, data):
        self.key_time_list_critical = data
        return self


class UtilTest(unittest.TestCase):
    """

@@ -302,3 +311,140 @@ class UtilTest(unittest.TestCase):
        # This should throw an error
        data = '[]'
        self.check_data(rules.dash_q(['(*).update_status,warn_me']), data, CRITICAL_CODE)

    def test_key_time(self):
        if sys.version_info[1] >= 11:
            # Test current timestamp.
            now = datetime.now(timezone.utc)
            data = "{\"timestamp\": \"%s\",\"timestamp2\": \"%s\"}" % (now, now)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30s', 'timestamp2,30s']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

            # Test 31 minutes in the past.
            data = "{\"timestamp\": \"%s\"}" % (now - timedelta(minutes=31))
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

            # Test two hours and one minute in the past.
            data = "{\"timestamp\": \"%s\"}" % (now - timedelta(hours=2) - timedelta(minutes=1))
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

            # Test one day and one minute in the past.
            data = "{\"timestamp\": \"%s\"}" % (now - timedelta(days=1) - timedelta(minutes=1))
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

            # Test two hours and one minute in the future.
            data = "{\"timestamp\": \"%s\"}" % (now + timedelta(hours=2) + timedelta(minutes=1))
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
        else:
            data = "{\"timestamp\": \"2020-01-01\"}"
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
test/testdata/README.md (new file, 65 lines, vendored)

@@ -0,0 +1,65 @@
# Example Data for Testing

Example calls:

```bash
python check_http_json.py -H localhost:8080 -p data0.json -q "age,20"
UNKNOWN: Status UNKNOWN. Could not find JSON in HTTP body.
```

```bash
python check_http_json.py -H localhost:8080 -p data1.json -e date
WARNING: Status WARNING. Key date did not exist.

python check_http_json.py -H localhost:8080 -p data1.json -E age
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data1.json -w "age,30"
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data1.json -w "age,20"
WARNING: Status WARNING. Value (30) for key age was outside the range 0:20.

python check_http_json.py -H localhost:8080 -p data1.json -q "age,20"
WARNING: Status WARNING. Key age mismatch. 20 != 30
```

```bash
python check_http_json.py -H localhost:8080 -p data2.json -q "(1).id,123"
WARNING: Status WARNING. Key (1).id mismatch. 123 != 2

python check_http_json.py -H localhost:8080 -p data2.json -Y "(1).id,2"
CRITICAL: Status CRITICAL. Key (1).id match found. 2 == 2

python check_http_json.py -H localhost:8080 -p data2.json -E "(1).author"
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data2.json -E "(1).pages"
CRITICAL: Status CRITICAL. Key (1).pages did not exist.
```

```bash
python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Developer"
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Dev"
WARNING: Status WARNING. Key company.employees.(0).role mismatch. Dev != Developer

python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Developer" "company.employees.(1).role,Designer"
OK: Status OK.
```

```bash
python check_http_json.py -H localhost:8080 -p data4.json -u "ratings(0),4.5"
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data4.json -u "ratings(0),4.1"
UNKNOWN: Status UNKNOWN. Key ratings(0) mismatch. 4.1 != 4.5
```

```bash
python check_http_json.py -H localhost:8080 -p data5.json -q service1.status,True service2.status,True service3.status,True
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data5.json -q "service1.status,True" -q "service2.status,True" -q "service3.status,False"
```
test/testdata/data0-invalid.json (new file, 1 line, vendored)

@@ -0,0 +1 @@
No JSON

test/testdata/data1.json (new file, 5 lines, vendored)

@@ -0,0 +1,5 @@
{
  "name": "John Doe",
  "age": 30,
  "city": "New York"
}

test/testdata/data2.json (new file, 17 lines, vendored)

@@ -0,0 +1,17 @@
[
  {
    "id": 1,
    "title": "Book One",
    "author": "Author One"
  },
  {
    "id": 2,
    "title": "Book Two",
    "author": "Author Two"
  },
  {
    "id": 3,
    "title": "Book Three",
    "author": "Author Three"
  }
]

test/testdata/data3.json (new file, 18 lines, vendored)

@@ -0,0 +1,18 @@
{
  "company": {
    "name": "Tech Corp",
    "location": "San Francisco",
    "employees": [
      {
        "name": "Alice",
        "role": "Developer"
      },
      {
        "name": "Bob",
        "role": "Designer"
      }
    ]
  },
  "founded": 2010,
  "industry": "Technology"
}

test/testdata/data4.json (new file, 13 lines, vendored)

@@ -0,0 +1,13 @@
{
  "id": 123,
  "active": true,
  "tags": ["tech", "startup", "innovation"],
  "details": {
    "website": "https://example.com",
    "contact": {
      "email": "info@example.com",
      "phone": "+1-234-567-890"
    }
  },
  "ratings": [4.5, 4.7, 4.8]
}

test/testdata/data5.json (new file, 38 lines, vendored)

@@ -0,0 +1,38 @@
{
  "service1": {
    "status": true
  },
  "service2": {
    "status": true,
    "meta": {
      "res": "PONG"
    }
  },
  "service3": {
    "status": true,
    "meta": {
      "took": 9,
      "timed_out": false,
      "_shards": {
        "total": 0,
        "successful": 0,
        "skipped": 0,
        "failed": 0
      },
      "hits": {
        "total": {
          "value": 10000,
          "relation": "gte"
        },
        "max_score": null,
        "hits": []
      }
    }
  },
  "service4": {
    "status": true,
    "meta": {
      "status": "ok"
    }
  }
}
test/testdata/docker-compose.yml (new file, 7 lines, vendored)

@@ -0,0 +1,7 @@
services:
  nginx:
    image: nginx:1-alpine
    ports:
      - "8080:80"
    volumes:
      - ./:/usr/share/nginx/html
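
The compose file above serves `test/testdata/` on port 8080 so the example calls in `test/testdata/README.md` can be run against it. A Docker-free alternative (a suggestion, not something the repository ships) is to serve the same directory with Python's built-in `http.server`:

```python
# Hypothetical helper, not part of the repository: serve test/testdata on
# http://localhost:8080 so the README example calls
# (e.g. -H localhost:8080 -p data1.json) can be run without Docker.
import functools
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = functools.partial(SimpleHTTPRequestHandler, directory="test/testdata")

if __name__ == "__main__":
    print("Serving test/testdata on http://localhost:8080 ...")
    HTTPServer(("", 8080), handler).serve_forever()
```

With either server running, `python check_http_json.py -H localhost:8080 -p data1.json -w "age,20"` should reproduce the WARNING shown in the examples above.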