Mirror of https://github.com/drewkerrigan/nagios-http-json.git
Commit 4e0d4e873b

**README.md**

@@ -4,15 +4,50 @@
This is a generic plugin for Nagios which checks JSON values from a given HTTP endpoint against argument-specified rules and determines the status and performance data for that service.

## Links

* [CLI Usage](#cli-usage)
* [Examples](#examples)
* [Riak Stats](docs/RIAK.md)
* [Docker](docs/DOCKER.md)
* [Nagios Installation](#nagios-installation)

## CLI Usage

Executing `./check_http_json.py -h` will yield the following details:
@@ -58,7 +93,9 @@ options:

```
  -t TIMEOUT, --timeout TIMEOUT
                        Connection timeout (seconds)
  --unreachable-state UNREACHABLE_STATE
                        Exit with specified code when the URL is unreachable. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)
  --invalid-json-state INVALID_JSON_STATE
                        Exit with specified code when no valid JSON is returned. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)
  -B AUTH, --basic-auth AUTH
                        Basic auth string "username:password"
  -D DATA, --data DATA  The http payload to send as a POST
```

@@ -83,6 +120,18 @@ options:

```
                        can be delimited with colon (key,value1:value2). Return warning if equality check fails
  -Q [KEY_VALUE_LIST_CRITICAL ...], --key_equals_critical [KEY_VALUE_LIST_CRITICAL ...]
                        Same as -q but return critical if equality check fails.
  --key_time [KEY_TIME_LIST ...]
                        Checks the timestamp of these keys and values
                        (key[>alias],value key2,value2) to determine status.
                        Multiple key values can be delimited with colon
                        (key,value1:value2). Return warning if the key is older
                        than the value (ex.: 30s,10m,2h,3d,...).
                        With @ it returns warning if the key is younger
                        than the value (ex.: @30s,@10m,@2h,@3d,...).
                        With a minus you can shift the time into the future.
  --key_time_critical [KEY_TIME_LIST_CRITICAL ...]
                        Same as --key_time but return critical if the
                        timestamp age check fails.
  -u [KEY_VALUE_LIST_UNKNOWN ...], --key_equals_unknown [KEY_VALUE_LIST_UNKNOWN ...]
                        Same as -q but return unknown if equality check fails.
  -y [KEY_VALUE_LIST_NOT ...], --key_not_equals [KEY_VALUE_LIST_NOT ...]
```
@@ -97,6 +146,8 @@ options:

```
                        (key[>alias],UnitOfMeasure), (key[>alias],UnitOfMeasure,WarnRange, CriticalRange).
```

The check plugin respects the environment variables `HTTP_PROXY` and `HTTPS_PROXY`.

## Examples

### Key Naming
@@ -160,6 +211,22 @@

        ]
    }

**Data for multiple keys for an object** `-q capacity1.value,True capacity2.value,True capacity3.value,True`

    {
        "capacity1": {
            "value": true
        },
        "capacity2": {
            "value": true
        },
        "capacity3": {
            "value": true
        }
    }

### Thresholds and Ranges

**Data**:
@@ -188,54 +255,49 @@

More info about Nagios Range format and Units of Measure can be found at [https://nagios-plugins.org/doc/guidelines.html](https://nagios-plugins.org/doc/guidelines.html).

### Timestamp

**Data**:

    { "metric": "2020-01-01 10:10:00.000000+00:00" }

#### Relevant Commands

* **Warning:** `./check_http_json.py -H <host>:<port> -p <path> --key_time "metric,TIME"`
* **Critical:** `./check_http_json.py -H <host>:<port> -p <path> --key_time_critical "metric,TIME"`

#### TIME Definitions

* **Format:** `[@][-]TIME`
* **Generates a Warning or Critical if...**
  * **Timestamp is more than 30 seconds in the past:** `30s`
  * **Timestamp is more than 5 minutes in the past:** `5m`
  * **Timestamp is more than 12 hours in the past:** `12h`
  * **Timestamp is more than 2 days in the past:** `2d`
  * **Timestamp is more than 30 minutes in the future:** `-30m`
  * **Timestamp is not more than 30 minutes in the future:** `@-30m`
  * **Timestamp is not more than 30 minutes in the past:** `@30m`

##### Timestamp Format

This plugin uses the Python function `datetime.fromisoformat`.
Since Python 3.11 any valid ISO 8601 format is supported, with the following exceptions:

* Time zone offsets may have fractional seconds.
* The T separator may be replaced by any single unicode character.
* Fractional hours and minutes are not supported.
* Reduced precision dates are not currently supported (YYYY-MM, YYYY).
* Extended date representations are not currently supported (±YYYYYY-MM-DD).
* Ordinal dates are not currently supported (YYYY-OOO).

Before Python 3.11, this method only supported formats that could be emitted by `date.isoformat()` or `datetime.isoformat()`.

More info and examples about the timestamp format can be found at [https://docs.python.org/3/library/datetime.html#datetime.datetime.fromisoformat](https://docs.python.org/3/library/datetime.html#datetime.datetime.fromisoformat).
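The sketch below is illustrative only, not the plugin's actual code path; `parse_time_spec` is a hypothetical helper. It shows how a `[@][-]TIME` rule can be mapped onto `datetime.fromisoformat` and `timedelta`, mirroring the `checkTimestamp` logic further down in this change:

```python
from datetime import datetime, timedelta, timezone

def parse_time_spec(spec):
    """Split a [@][-]TIME rule like '@-30m' into (invert, negative, timedelta)."""
    invert = spec.startswith('@')     # '@' inverts the check ("not more than ...")
    spec = spec.lstrip('@')
    negative = spec.startswith('-')   # '-' shifts the limit into the future
    spec = spec.lstrip('-')
    units = {'s': 'seconds', 'm': 'minutes', 'h': 'hours', 'd': 'days'}
    duration = timedelta(**{units[spec[-1]]: int(spec[:-1])})
    return invert, negative, duration

# Example: is the value of "metric" older than 30 minutes?
value = "2020-01-01 10:10:00.000000+00:00"
timestamp = datetime.fromisoformat(value)
if timestamp.tzinfo is None:          # naive timestamps are treated as UTC here
    timestamp = timestamp.replace(tzinfo=timezone.utc)

invert, negative, duration = parse_time_spec("30m")
age = datetime.now(timezone.utc) - timestamp
too_old = age > duration              # rule '30m' fails (warns) if this is True
print(invert, negative, duration, too_old)
```

With `@30m` the result of the comparison is inverted (fail when the timestamp is *not* older than 30 minutes), and a leading `-` moves the limit into the future.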
#### Using Headers

* `./check_http_json.py -H <host>:<port> -p <path> -A '{"content-type": "application/json"}' -w "metric,RANGE"`
## Nagios Installation

### Requirements

* Python 3.6+

### Configuration

Assuming a standard installation of Nagios, the plugin can be executed from the machine that Nagios is running on.

```bash
cp check_http_json.py /usr/local/nagios/libexec/plugins/check_http_json.py
chmod +x /usr/local/nagios/libexec/plugins/check_http_json.py
```

Add the following service definition to your server config (`localhost.cfg`):

```
define service {
    use                     local-service
    host_name               localhost
    service_description     <command_description>
    check_command           <command_name>
}
```

Add the following command definition to your commands config (`commands.cfg`):

```
define command {
    command_name    <command_name>
    command_line    /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H <host>:<port> -p <path> [-e|-q|-w|-c <rules>] [-m <metrics>]
}
```

## Icinga2 configuration

The Icinga2 command definition can be found here: (`contrib/icinga2_check_command_definition.conf`)

## License

Copyright 2014-2015 Drew Kerrigan.
**check_http_json.py**

@@ -6,8 +6,10 @@ import json

```python
import argparse
import sys
import ssl
import traceback
from urllib.error import HTTPError
from urllib.error import URLError
from datetime import datetime, timedelta, timezone

plugin_description = \
    """
```
@@ -23,8 +25,8 @@ WARNING_CODE = 1

```python
CRITICAL_CODE = 2
UNKNOWN_CODE = 3

__version__ = '2.3.0'
__version_date__ = '2025-04-11'

class NagiosHelper:
    """
```
@@ -234,11 +236,13 @@ class JsonRuleProcessor:

```python
        self.key_value_list = self.expandKeys(self.rules.key_value_list)
        self.key_value_list_not = self.expandKeys(
            self.rules.key_value_list_not)
        self.key_time_list = self.expandKeys(self.rules.key_time_list)
        self.key_list = self.expandKeys(self.rules.key_list)
        self.key_value_list_critical = self.expandKeys(
            self.rules.key_value_list_critical)
        self.key_value_list_not_critical = self.expandKeys(
            self.rules.key_value_list_not_critical)
        self.key_time_list_critical = self.expandKeys(self.rules.key_time_list_critical)
        self.key_list_critical = self.expandKeys(self.rules.key_list_critical)
        self.key_value_list_unknown = self.expandKeys(
            self.rules.key_value_list_unknown)
```
@@ -330,6 +334,72 @@ class JsonRuleProcessor:

```python
            failure += self.checkThreshold(key, alias, r)
        return failure

    def checkTimestamp(self, key, alias, r):
        failure = ''
        # '@' inverts the check; '-' shifts the limit into the future.
        invert = False
        negative = False
        if r.startswith('@'):
            invert = True
            r = r[1:]
        if r.startswith('-'):
            negative = True
            r = r[1:]
        duration = int(r[:-1])
        unit = r[-1]

        if unit == 's':
            time_duration = timedelta(seconds=duration)
        elif unit == 'm':
            time_duration = timedelta(minutes=duration)
        elif unit == 'h':
            time_duration = timedelta(hours=duration)
        elif unit == 'd':
            time_duration = timedelta(days=duration)
        else:
            return " Value (%s) is not a valid time duration." % (r)

        if not self.helper.exists(key):
            return " Key (%s) for alias %s does not exist." % \
                   (key, alias)

        try:
            timestamp = datetime.fromisoformat(self.helper.get(key))
        except ValueError as ve:
            return " Value (%s) for key %s is not a date in ISO format. %s" % \
                   (self.helper.get(key), alias, ve)

        now = datetime.now(timezone.utc)

        # Naive timestamps are treated as UTC.
        if timestamp.tzinfo is None:
            timestamp = timestamp.replace(tzinfo=timezone.utc)

        age = now - timestamp

        if not negative:
            if age > time_duration and not invert:
                failure += " Value (%s) for key %s is older than now-%s%s." % \
                           (self.helper.get(key), alias, duration, unit)
            if not age > time_duration and invert:
                failure += " Value (%s) for key %s is newer than now-%s%s." % \
                           (self.helper.get(key), alias, duration, unit)
        else:
            if age < -time_duration and not invert:
                failure += " Value (%s) for key %s is newer than now+%s%s." % \
                           (self.helper.get(key), alias, duration, unit)
            if not age < -time_duration and invert:
                failure += " Value (%s) for key %s is older than now+%s%s." % \
                           (self.helper.get(key), alias, duration, unit)

        return failure

    def checkTimestamps(self, threshold_list):
        failure = ''
        for threshold in threshold_list:
            k, r = threshold.split(',')
            key, alias = _getKeyAlias(k)
            failure += self.checkTimestamp(key, alias, r)
        return failure

    def checkWarning(self):
        failure = ''
        if self.key_threshold_warning is not None:
```
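To see the new rule type end to end without any HTTP traffic, here is a minimal sketch (not part of the change itself) that feeds a stale timestamp through `JsonRuleProcessor` directly, assuming `check_http_json.py` is importable from the working directory:

```python
import json
from datetime import datetime, timedelta, timezone

from check_http_json import JsonRuleProcessor, parseArgs

# A timestamp two hours in the past should trip a "metric,30m" warning rule.
stale = (datetime.now(timezone.utc) - timedelta(hours=2)).isoformat()
data = json.loads('{"metric": "%s"}' % stale)

args = parseArgs(['-H', 'localhost', '--key_time', 'metric,30m'])
processor = JsonRuleProcessor(data, args)
print(processor.checkWarning())   # non-empty failure text -> WARNING
print(processor.checkCritical())  # empty -> no critical failure
```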
@@ -338,6 +408,8 @@ class JsonRuleProcessor:

```python
            failure += self.checkEquality(self.key_value_list)
        if self.key_value_list_not is not None:
            failure += self.checkNonEquality(self.key_value_list_not)
        if self.key_time_list is not None:
            failure += self.checkTimestamps(self.key_time_list)
        if self.key_list is not None:
            failure += self.checkExists(self.key_list)
        return failure
```
@@ -352,6 +424,8 @@ class JsonRuleProcessor:

```python
            failure += self.checkEquality(self.key_value_list_critical)
        if self.key_value_list_not_critical is not None:
            failure += self.checkNonEquality(self.key_value_list_not_critical)
        if self.key_time_list_critical is not None:
            failure += self.checkTimestamps(self.key_time_list_critical)
        if self.key_list_critical is not None:
            failure += self.checkExists(self.key_list_critical)
        return failure
```
@@ -446,7 +520,9 @@ def parseArgs(args):

```python
    parser.add_argument('-t', '--timeout', type=int,
                        help='Connection timeout (seconds)')
    parser.add_argument('--unreachable-state', type=int, default=3,
                        help='Exit with specified code when the URL is unreachable. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)')
    parser.add_argument('--invalid-json-state', type=int, default=3,
                        help='Exit with specified code when no valid JSON is returned. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)')
    parser.add_argument('-B', '--basic-auth', dest='auth',
                        help='Basic auth string "username:password"')
    parser.add_argument('-D', '--data', dest='data',
```
@@ -490,6 +566,19 @@ def parseArgs(args):

```python
                        dest='key_value_list_critical', nargs='*',
                        help='''Same as -q but return critical if
                        equality check fails.''')
    parser.add_argument('--key_time', dest='key_time_list', nargs='*',
                        help='''Checks the timestamp of these keys and values
                        (key[>alias],value key2,value2) to determine status.
                        Multiple key values can be delimited with colon
                        (key,value1:value2). Return warning if the key is older
                        than the value (ex.: 30s,10m,2h,3d,...).
                        With @ it returns warning if the key is younger
                        than the value (ex.: @30s,@10m,@2h,@3d,...).
                        With a minus you can shift the time into the future.''')
    parser.add_argument('--key_time_critical',
                        dest='key_time_list_critical', nargs='*',
                        help='''Same as --key_time but return critical if
                        the timestamp age check fails.''')
    parser.add_argument('-u', '--key_equals_unknown',
                        dest='key_value_list_unknown', nargs='*',
                        help='''Same as -q but return unknown if
```
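A quick way to confirm how the new flags parse (a sketch with illustrative host and path values, assuming `check_http_json.py` is importable):

```python
from check_http_json import parseArgs

args = parseArgs(['-H', 'localhost:8080', '-p', '/health',
                  '--key_time', 'metric,30m', 'other>alias,@-2h',
                  '--key_time_critical', 'metric,2d'])
print(args.key_time_list)           # ['metric,30m', 'other>alias,@-2h']
print(args.key_time_list_critical)  # ['metric,2d']
```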
@@ -634,13 +723,15 @@ def main(cliargs):

```python
    json_data = ''

    try:
        # Requesting the data from the URL
        json_data = make_request(args, url, context)
    except HTTPError as e:
        # Try to recover from HTTP Error, if there is JSON in the response
        if "json" in e.info().get_content_subtype():
            json_data = e.read()
        else:
            exit_code = args.invalid_json_state
            nagios.append_message(exit_code, " Could not find JSON in HTTP body. HTTPError[%s], url:%s" % (str(e.code), url))
    except URLError as e:
        # Some users might prefer another exit code if the URL wasn't reached
        exit_code = args.unreachable_state
```
@@ -650,23 +741,32 @@ def main(cliargs):

```python
        sys.exit(nagios.getCode())

    try:
        # Loading the JSON data from the request
        data = json.loads(json_data)
    except ValueError as e:
        exit_code = args.invalid_json_state
        debugPrint(args.debug, traceback.format_exc())
        nagios.append_message(exit_code, " JSON Parser error: %s" % str(e))
        print(nagios.getMessage())
        sys.exit(nagios.getCode())
    else:
        verbosePrint(args.verbose, 1, json.dumps(data, indent=2))

        try:
            # Applying rules to returned JSON data
            processor = JsonRuleProcessor(data, args)
            nagios.append_message(WARNING_CODE, processor.checkWarning())
            nagios.append_message(CRITICAL_CODE, processor.checkCritical())
            nagios.append_metrics(processor.checkMetrics())
            nagios.append_message(UNKNOWN_CODE, processor.checkUnknown())
        except Exception as e:  # pylint: disable=broad-exception-caught
            debugPrint(args.debug, traceback.format_exc())
            nagios.append_message(UNKNOWN_CODE, " Rule Parser error: %s" % str(e))

    # Print Nagios specific string and exit appropriately
    print(nagios.getMessage())
    sys.exit(nagios.getCode())


if __name__ == "__main__":
    # Program entry point
    main(sys.argv[1:])
```
**Tests**

@@ -28,6 +28,8 @@ class RulesHelper:

```python
    key_threshold_critical = None
    key_value_list_critical = None
    key_value_list_not_critical = None
    key_time_list = None
    key_time_list_critical = None
    key_value_list_unknown = None
    key_list_critical = None
    metric_list = None
```
@@ -71,7 +73,14 @@ class RulesHelper:

```python
    def dash_c(self, data):
        self.key_threshold_critical = data
        return self

    def dash_dash_key_time(self, data):
        self.key_time_list = data
        return self

    def dash_dash_key_time_critical(self, data):
        self.key_time_list_critical = data
        return self


class UtilTest(unittest.TestCase):
    """
```
@@ -302,3 +311,140 @@ class UtilTest(unittest.TestCase):

```python
        # This should throw an error
        data = '[]'
        self.check_data(rules.dash_q(['(*).update_status,warn_me']), data, CRITICAL_CODE)

    def test_key_time(self):
        if sys.version_info[1] >= 11:
            # Test current timestamp.
            now = datetime.now(timezone.utc)
            data = "{\"timestamp\": \"%s\",\"timestamp2\": \"%s\"}" % (now, now)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30s', 'timestamp2,30s']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

            # Test 31 minutes in the past.
            data = "{\"timestamp\": \"%s\"}" % (now - timedelta(minutes=31))
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

            # Test two hours and one minute in the past.
            data = "{\"timestamp\": \"%s\"}" % (now - timedelta(hours=2) - timedelta(minutes=1))
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

            # Test one day and one minute in the past.
            data = "{\"timestamp\": \"%s\"}" % (now - timedelta(days=1) - timedelta(minutes=1))
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

            # Test two hours and one minute in the future.
            data = "{\"timestamp\": \"%s\"}" % (now + timedelta(hours=2) + timedelta(minutes=1))
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
        else:
            data = "{\"timestamp\": \"2020-01-01\"}"
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, CRITICAL_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)

            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
            self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
            self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
```
**test/testdata/README.md** (new file)

@@ -0,0 +1,65 @@
# Example Data for Testing

Example calls:

```bash
python check_http_json.py -H localhost:8080 -p data0.json -q "age,20"
UNKNOWN: Status UNKNOWN. Could not find JSON in HTTP body.
```

```bash
python check_http_json.py -H localhost:8080 -p data1.json -e date
WARNING: Status WARNING. Key date did not exist.

python check_http_json.py -H localhost:8080 -p data1.json -E age
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data1.json -w "age,30"
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data1.json -w "age,20"
WARNING: Status WARNING. Value (30) for key age was outside the range 0:20.

python check_http_json.py -H localhost:8080 -p data1.json -q "age,20"
WARNING: Status WARNING. Key age mismatch. 20 != 30
```

```bash
python check_http_json.py -H localhost:8080 -p data2.json -q "(1).id,123"
WARNING: Status WARNING. Key (1).id mismatch. 123 != 2

python check_http_json.py -H localhost:8080 -p data2.json -Y "(1).id,2"
CRITICAL: Status CRITICAL. Key (1).id match found. 2 == 2

python check_http_json.py -H localhost:8080 -p data2.json -E "(1).author"
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data2.json -E "(1).pages"
CRITICAL: Status CRITICAL. Key (1).pages did not exist.
```

```bash
python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Developer"
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Dev"
WARNING: Status WARNING. Key company.employees.(0).role mismatch. Dev != Developer

python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Developer" "company.employees.(1).role,Designer"
OK: Status OK.
```

```bash
python check_http_json.py -H localhost:8080 -p data4.json -u "ratings(0),4.5"
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data4.json -u "ratings(0),4.1"
UNKNOWN: Status UNKNOWN. Key ratings(0) mismatch. 4.1 != 4.5
```

```bash
python check_http_json.py -H localhost:8080 -p data5.json -q service1.status,True service2.status,True service3.status,True
OK: Status OK.

python check_http_json.py -H localhost:8080 -p data5.json -q "service1.status,True" -q "service2.status,True" -q "service3.status,False"
```
**test/testdata/data0-invalid.json** (new file)

@@ -0,0 +1 @@

```
No JSON
```
**test/testdata/data1.json** (new file)

@@ -0,0 +1,5 @@

```json
{
    "name": "John Doe",
    "age": 30,
    "city": "New York"
}
```
**test/testdata/data2.json** (new file)

@@ -0,0 +1,17 @@

```json
[
    {
        "id": 1,
        "title": "Book One",
        "author": "Author One"
    },
    {
        "id": 2,
        "title": "Book Two",
        "author": "Author Two"
    },
    {
        "id": 3,
        "title": "Book Three",
        "author": "Author Three"
    }
]
```
**test/testdata/data3.json** (new file)

@@ -0,0 +1,18 @@

```json
{
    "company": {
        "name": "Tech Corp",
        "location": "San Francisco",
        "employees": [
            {
                "name": "Alice",
                "role": "Developer"
            },
            {
                "name": "Bob",
                "role": "Designer"
            }
        ]
    },
    "founded": 2010,
    "industry": "Technology"
}
```
**test/testdata/data4.json** (new file)

@@ -0,0 +1,13 @@

```json
{
    "id": 123,
    "active": true,
    "tags": ["tech", "startup", "innovation"],
    "details": {
        "website": "https://example.com",
        "contact": {
            "email": "info@example.com",
            "phone": "+1-234-567-890"
        }
    },
    "ratings": [4.5, 4.7, 4.8]
}
```
**test/testdata/data5.json** (new file)

@@ -0,0 +1,38 @@

```json
{
    "service1": {
        "status": true
    },
    "service2": {
        "status": true,
        "meta": {
            "res": "PONG"
        }
    },
    "service3": {
        "status": true,
        "meta": {
            "took": 9,
            "timed_out": false,
            "_shards": {
                "total": 0,
                "successful": 0,
                "skipped": 0,
                "failed": 0
            },
            "hits": {
                "total": {
                    "value": 10000,
                    "relation": "gte"
                },
                "max_score": null,
                "hits": []
            }
        }
    },
    "service4": {
        "status": true,
        "meta": {
            "status": "ok"
        }
    }
}
```
**test/testdata/docker-compose.yml** (new file)

@@ -0,0 +1,7 @@

```yaml
services:
  nginx:
    image: nginx:1-alpine
    ports:
      - "8080:80"
    volumes:
      - ./:/usr/share/nginx/html
```
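If Docker is not available, a minimal sketch of serving `test/testdata/` on port 8080 with only the Python standard library (a hypothetical helper script, assumed to be run from inside `test/testdata/`; not part of the change):

```python
# serve_testdata.py - static file server for the example data,
# as an alternative to the docker-compose.yml above.
import http.server
import socketserver

PORT = 8080  # the example calls in the testdata README assume localhost:8080

handler = http.server.SimpleHTTPRequestHandler  # serves the current directory

with socketserver.TCPServer(("", PORT), handler) as httpd:
    print("Serving test data on http://localhost:%d" % PORT)
    httpd.serve_forever()
```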