158 Commits
v1.2 ... master

Author SHA1 Message Date
Markus Opolka
027af5f3f5 Merge pull request #104 from drewkerrigan/fix/multiple-flags
Fix the use of multiple similar CLI flags
2025-06-06 12:21:22 +02:00
Markus Opolka
6eef16b85b Fix the use of multiple similar CLI flags 2025-06-06 12:17:51 +02:00
Markus Opolka
ccf05d469a Merge pull request #101 from drewkerrigan/bump-ci
Bump GH Actions
2025-04-11 15:53:39 +02:00
Markus Opolka
115acc06fd Bump GH Actions 2025-04-11 15:52:16 +02:00
Markus Opolka
164632faa5 Bump dev requirements 2025-04-11 15:51:07 +02:00
Markus Opolka
039cb0adb6 Update example configuration 2025-04-11 15:45:06 +02:00
Markus Opolka
4e0d4e873b Merge pull request #99 from drewkerrigan/release-2-3-0
Release 2.3.0
2025-04-11 15:31:43 +02:00
Markus Opolka
186f081cd7 Bump version 2025-04-11 15:30:36 +02:00
Markus Opolka
e15f0f01ed Update example data 2025-04-11 14:51:54 +02:00
Markus Opolka
b61789e4a4 Merge pull request #100 from drewkerrigan/feature/no-json-state
Add cli invalid-json-state to change exit code for invalid JSON
2025-04-11 12:57:26 +02:00
Markus Opolka
c6daa09ba2 Adds a CLI flag invalid-json-state to change exit code for invalid JSON 2025-04-10 16:54:53 +02:00
Markus Opolka
3a1e7d90d0 Mention proxy variables in README 2025-04-09 17:18:34 +02:00
Markus Opolka
afb2ef7b88 Update README
- Improve structure a bit, moving the installation first and usage second
2025-04-09 17:15:19 +02:00
Markus Opolka
2a6d88bc39 Add testdata to simplify integration testing 2025-04-09 16:57:17 +02:00
Markus Opolka
2dbb38512f Merge pull request #98 from drewkerrigan/fix/improve-error-handling
Improve error handling
2025-04-09 16:46:22 +02:00
Markus Opolka
9ff11308be add unreachable-state option to icinga2 config (#97) 2025-04-09 13:01:19 +02:00
Dirk
c634ae8bb5 add unreachable-state option to icinga2 config 2025-04-09 11:51:35 +02:00
Markus Opolka
d3a2f3ed9e Improve error handling
- Added another try-catch around the CLI Rules parsing
   to make sure that users get a clean exit code and error messages
2024-07-29 11:11:41 +02:00
Markus Opolka
9d344f5a7a Add multiple key example to README 2024-07-29 09:45:30 +02:00
Markus Opolka
5c4a955abd Fix object comparison 2024-06-28 14:40:19 +02:00
aro-lew
b920a65afd Feature: Add Timestamp checks (#87) 2024-06-28 14:39:57 +02:00
Markus Opolka
d9efd1d858 Merge pull request #91 from drewkerrigan/chore/readme
Update README
2024-05-16 10:24:10 +02:00
Markus Opolka
e72030a087 Update README
- Update CLI options
2024-05-16 10:22:33 +02:00
Markus Opolka
6b51e1bb06 Merge pull request #90 from drewkerrigan/chore/update-makefile
Update makefile
2024-05-16 10:16:30 +02:00
Markus Opolka
3f73984f6b Change makefile to use python3
- Introduces a variable to override this if necessary
2024-05-16 10:14:38 +02:00
Markus Opolka
09a7ec080c Update Python versions in GitHub Actions 2024-05-16 10:14:28 +02:00
Markus Opolka
1f52898d10 Merge pull request #88 from drewkerrigan/release/v2-2-0
Bump release to v2.2.0
2024-05-14 17:02:54 +02:00
Markus Opolka
27936784c4 Bump release to v2.2.0 2024-05-14 17:01:53 +02:00
Markus Opolka
fa157753ce Merge pull request #86 from drewkerrigan/feature/verbose-http
Add flag to increase verbosity and flag to override unreachable state
2024-05-14 16:55:46 +02:00
Markus Opolka
0aceabfe91 Add verbose flag and function that can be used to enhance output more precisely
- Before we only had a boolean debug flag, good for debugging errors.
   The verbose flag can be used more precisely (`-v -vvv`) to specify when
   something should be printed. This is useful for adding more output whilst avoiding
   full debug output that contains secrets.
2024-04-09 14:08:02 +02:00
Markus Opolka
4fbb0c828a Add flag to override URL unreachable state
- I refactored the Nagios helper a bit to integrate this functionality more simply.
   Before, we had distinct methods on the helper that added warn/crit/unknown messages; now
   there's a general method that takes an int as a parameter.
   This way we avoid if-else structures for the new functionality.
2024-04-09 14:07:45 +02:00
Markus Opolka
e96bba0eb8 Refactor for a leaner main function
- Also added tests for TLS options
2024-04-09 14:07:36 +02:00
Markus Opolka
d9ee817dfc Update dev-requirements 2024-04-09 14:07:29 +02:00
Markus Opolka
ce9c5fdada Merge pull request #85 from drewkerrigan/extend-tests
Extend tests for array syntax
2024-03-22 15:52:10 +01:00
Markus Opolka
27c710b2ea Extend tests for array syntax 2024-03-22 15:45:25 +01:00
Markus Opolka
dddf8432d6 Merge pull request #80 from mho21/master
disabled check_hostname to prevent error message when setting CERT_NONE
2022-10-04 16:26:06 +02:00
Markus Hof
739c093702 disabled check_hostname to prevent error message when setting CERT_NONE 2022-10-04 16:04:12 +02:00
Markus Opolka
46271c961b Bump version to 2.1.2 2022-09-15 15:25:38 +02:00
Markus Opolka
49b338bdb6 Merge pull request #79 from drewkerrigan/feature/http-method
Add CLI Flag to change HTTP method
2022-09-15 15:22:48 +02:00
Markus Opolka
9f41fc491e Add CLI flag to change HTTP method 2022-09-09 17:28:35 +02:00
Markus Opolka
3a22b712ab Fix deprecation of PROTOCOL_TLS 2022-09-09 17:26:23 +02:00
Markus Opolka
9626fc4464 Merge pull request #78 from drewkerrigan/docs/update-repo
Update Makefile and Workflows
2022-09-08 10:09:43 +02:00
Markus Opolka
c54a0040a0 Update pylint config 2022-09-08 10:08:39 +02:00
Markus Opolka
ffd96dd59f Update GitHub Workflow 2022-09-08 10:04:20 +02:00
Markus Opolka
0572c2f494 Update Makefile
- Use python from venv
2022-09-08 10:01:23 +02:00
Markus Opolka
2e6eaeea59 Merge pull request #77 from K0nne/patch-1
fix missing type conversion for --data
2022-09-08 09:59:28 +02:00
K0nne
428a5a6d3a fix missing type conversion for --data
The parameter --data is handled as a string, but the method urlopen() only accepts bytes.
Before this fix you will get: "TypeError: POST data should be bytes, an iterable of bytes, or a file object. It cannot be of type str."
This PR fixes that.
2022-07-27 13:30:25 +02:00
Markus Opolka
e3ac06864d Merge pull request #68 from ccin2p3/feature/load_default_ca_certs
[TLS] Always load system default C.A files
2021-01-22 10:39:33 +01:00
Rémi Ferrand
63542f3226 If TLS is enabled, context now loads the system default C.A files
* This allows system wide deployed C.A to be used without any further
  configuration.
2021-01-21 12:02:41 +01:00
Markus Opolka
cdb2474ee0 Update README 2020-11-24 20:27:40 +01:00
Markus Opolka
2821a1ae66 Merge pull request #66 from drewkerrigan/array-bug
Fix conditional check on empty data.
2020-09-14 10:36:14 +02:00
Markus Opolka
831bfdf97b Merge pull request #65 from alesc/patch-1
Update icinga2_check_command_definition.conf
2020-09-12 08:21:01 +02:00
alesc
f612277772 Update icinga2_check_command_definition.conf
small error in icinga2 conf definition, --key_metricS does not exist --key_metric does.
2020-09-11 10:18:02 +02:00
Markus Opolka
1f440e0ff5 Fix conditional check on empty data.
Fixes issue #64
2020-07-15 08:07:16 +02:00
Markus Opolka
c23ebac77a Merge pull request #63 from drewkerrigan/v2-1
Release 2.1
2020-07-03 09:58:55 +02:00
Markus Opolka
a014944981 Merge pull request #62 from drewkerrigan/key-equals-empty
Add handling of empty JSON return values
2020-07-03 09:04:53 +02:00
Markus Opolka
47a37556ba Merge branch 'v2-1' into key-equals-empty 2020-07-03 09:03:21 +02:00
Markus Opolka
41279cad2c Merge pull request #60 from drewkerrigan/http-json
Parse JSON on HTTPError, if JSON in response
2020-07-03 09:02:14 +02:00
Markus Opolka
f7c0472cdc Add JSON parsing on HTTPError
- Only if response contains JSON
2020-07-02 11:35:23 +02:00
Markus Opolka
25fb340bbb Add handling of empty JSON return values
- Will now throw a CRITICAL
2020-06-26 10:39:37 +02:00
Markus Opolka
47bdea7fc5 Add spaces to debug output 2020-06-26 10:31:20 +02:00
Markus Opolka
866a12ea07 Add JSON parsing on HTTPError
- Only if response contains JSON
2020-06-19 14:26:59 +02:00
Markus Opolka
d1e585b2dd Adjust pylint for new function 2020-06-19 13:16:07 +02:00
Markus Opolka
941afeed89 Move main entrypoint to own function for simpler testing 2020-06-19 13:12:51 +02:00
Markus Opolka
b9a583f281 Add Makefile and requirements.txt for easier testing 2020-06-19 13:12:48 +02:00
Markus Opolka
4c89a8a93d Update README 2020-04-03 11:53:00 +02:00
Markus Opolka
73557b3657 Merge pull request #58 from bb-Ricardo/next-release
added icinga2 command definitions
2020-04-03 11:50:29 +02:00
Ricardo Bartels
aad2376ac0 added missing cli args for V2.0 2020-04-02 08:39:42 +02:00
Markus Opolka
219e99386c Merge pull request #57 from drewkerrigan/v2.0
Release V2.0
2020-03-31 18:15:06 +02:00
Markus Opolka
0cbbf41b9c Update README 2020-03-23 09:11:22 +01:00
Markus Opolka
dd952fd571 Replace deprecated encodestring
- Fixes #56
2020-03-18 08:47:27 +01:00
Markus Opolka
83ee5062f5 [wip] Add pylint and fix pylint issues 2020-03-18 08:18:07 +01:00
Markus Opolka
c90b0323f5 Show returned JSON in OK Status when performance data is requested 2020-03-18 07:41:23 +01:00
Markus Opolka
1ac160e8c2 Add boilerplate for CLI tests 2020-03-18 07:41:23 +01:00
Markus Opolka
6fc41612c4 Add unittest for debugprint 2020-03-18 07:41:23 +01:00
Markus Opolka
f567c1ca0c Add unittest for metric key alias 2020-03-18 07:41:23 +01:00
Markus Opolka
2c98e840e8 Extend unittest coverage 2020-03-18 07:41:18 +01:00
Markus Opolka
1a9e1e9048 Add unittest for NagiosHelper 2020-03-15 09:45:18 +01:00
Markus Opolka
4f1d29dc7e Merge pull request #55 from marxin/fix-python38-warnings
Fix new Python3.8 warnings:
2020-03-12 14:17:34 +01:00
Martin Liska
404890d918 Fix new Python3.8 warnings:
./check_http_json.py:186: SyntaxWarning: "is" with a literal. Did you mean "=="?
  if elemData is (None, 'not_found'):
./check_http_json.py:189: SyntaxWarning: "is not" with a literal. Did you mean "!="?
  if subElemKey is not '':
2020-03-12 09:37:53 +01:00
Markus Opolka
e7cf7ca8fb Merge pull request #54 from drewkerrigan/fix-issue-43
Add value_separator option to specify how JSON values are being split
2020-03-09 20:39:00 +01:00
Markus Opolka
71cbd98e79 Add value_separator option to specify how JSON values are being split
- Fixes issue 43
2020-03-08 11:28:27 +01:00
Markus Opolka
5c416cd0c0 Merge pull request #53 from drewkerrigan/fix-issue-34
Add boundary check for SubArrayElement function
2020-03-05 19:51:55 +01:00
Markus Opolka
e4801227bf Add boundary check for SubArrayElement function
- Fixes Issue 34
2020-03-05 10:05:32 +01:00
Markus Opolka
b7c0b0595e Add unittest for argsparse 2020-03-03 12:12:52 +01:00
Markus Opolka
375da5d605 Add test case for key_value_list_unknown 2020-03-03 11:59:36 +01:00
Markus Opolka
95912246a2 Add and format some doc_strings 2020-03-03 11:44:55 +01:00
Markus Opolka
ba9d9b1c39 Use Python3 in GitHub Action 2020-03-03 10:04:03 +01:00
Markus Opolka
3f81e32b29 Add CI Badge to README 2020-03-03 10:02:51 +01:00
Markus Opolka
f97759f1bd Add Coverage report 2020-03-03 10:02:51 +01:00
Markus Opolka
e95daad8ff Add GitHub Action for Unit Test 2020-03-03 10:02:49 +01:00
Markus Opolka
174686a980 Move test to separate file 2020-03-03 09:48:10 +01:00
Markus Opolka
24889384b0 Add gitignore file 2020-02-16 09:55:58 +01:00
Markus Opolka
21f48681c9 Merge pull request #50 from marxin/port-to-python3
Port to Python3.
2020-02-16 09:37:00 +01:00
Markus Opolka
2196dba761 Merge pull request #51 from marxin/document-asterisk
Document syntax of array selector: (*).field_name.
2020-02-03 16:06:54 +01:00
marxin
c2435a8cbf Document syntax of array selector: (*).field_name. 2020-02-01 21:47:08 +01:00
Martin Liska
2289fb2af3 Port to Python3 (#48).
I used 2to3 script and then I clean up result of the conversion.
2020-01-28 12:13:30 +01:00
Markus Opolka
2541223cde Merge pull request #49 from drewkerrigan/fix-ssl
Fix --insecure option
2020-01-28 10:52:11 +01:00
Markus Opolka
209aaef041 Fix inconsistent use of tabs/spaces 2020-01-28 10:42:22 +01:00
Markus Opolka
65c3bd2a25 Set default context variable to -k --insecure option 2020-01-28 10:41:29 +01:00
Markus Opolka
9c0c59d6c1 Fix unittests 2020-01-28 10:22:05 +01:00
Markus Opolka
e2fce71d5a Merge pull request #46 from bb-Ricardo/master
pull improvements from different branches together
2020-01-28 10:15:49 +01:00
Ricardo Bartels
26a1b3dbe8 added icinga2 command definitions 2019-05-10 11:21:34 +02:00
Ricardo Bartels
47547951cf fixed minor bugs and added compatibility for RHEL/CentOS 7.x
* change ssl.PROTOCOL_TLS to ssl.PROTOCOL_SSLv23
* fixed bug that response var not passed outside try/except block
* fixed error in nagios.append_metrics()
2019-05-09 16:39:41 +02:00
Ricardo Bartels
7858382bbe Added default User-Agent header
* prevent errors for services which require this header (like Cloudflare WAF)
2019-05-09 15:53:59 +02:00
Ricardo Bartels
1173420803 updated README with current cli options 2019-05-09 15:48:29 +02:00
Ricardo Bartels
bcc36a6e95 added version information and improved help text 2019-05-09 15:44:33 +02:00
Ricardo Bartels
d98d0396b2 return more meaningful error message if parsing of data failed 2019-05-09 15:06:52 +02:00
Ricardo Bartels
8437c464e5 refine ssl insecure and client certificate options
* default TLS Protocols are now set to >= TLS1
* --cacert and --cert are no longer mandatory if option -s is used
* proper error messages if parsing of cert or key files fails
2019-05-09 14:55:25 +02:00
Ricardo Bartels
df2bbdbf51 Merge remote-tracking branch 'theicfire/master' into next-release 2019-05-09 13:38:37 +02:00
Ricardo Bartels
823fc275c9 fixed expansion on newly merged command line args 2019-05-09 13:18:34 +02:00
Ricardo Bartels
18b0898e72 Merge remote-tracking branch 'nrobert13/tg' into next-release 2019-05-09 12:39:58 +02:00
Ricardo Bartels
95318954bf fixed indentation and print statements
* clean up from previous merges
2019-05-09 11:58:50 +02:00
Ricardo Bartels
8e469e3d98 Merge branch 'luban8' into next-release 2019-05-09 11:30:26 +02:00
Ricardo Bartels
29f8d892ee Merge branch 'ack-expand-array' into next-release 2019-05-09 11:17:51 +02:00
luban8
cbdb884dc7 Update README.md 2019-05-07 16:27:56 +02:00
luban8
3a108aef5e Update README.md 2019-05-07 16:25:00 +02:00
Martin Sura
81522fa9ab fix indentation 2019-05-07 16:23:48 +02:00
Martin Sura
27eaaf0842 Add unknown option 2019-05-07 16:15:31 +02:00
Chase Lambert
9dd6323b85 Better failure message for exact keys 2018-04-02 09:34:00 -04:00
Robert Nemeti
67136a4a2b add client ssl cert support 2018-02-15 17:04:04 +01:00
Robert Nemeti
d164a1250c add key,value non equality check, the opposite of the -q and -Q 2018-01-10 10:23:34 +01:00
Robert Nemeti
89f42c15a0 use python2.7 because on centos 6 (icinga) the default python is 2.6 and doesn't have the required ssl libraries 2017-08-10 15:41:35 +02:00
Robert Nemeti
1e707a4b6a add repo and upstream info 2017-08-10 15:05:53 +02:00
Robert Nemeti
9656265439 print current value in the icinga message 2017-08-10 10:29:45 +02:00
Robert Nemeti
e463369671 added insecure argument for the ssl connections 2017-08-10 10:28:02 +02:00
Drew Kerrigan
357c2240ba Merge pull request #31 from thmshmm/master
fix unknown_message bug
2017-07-19 09:37:49 -07:00
Thomas Hamm
42d1e08037 fix unknown_message bug 2017-01-26 16:26:45 +00:00
Drew Kerrigan
4950225393 adding support for (*) to all flags 2016-07-19 12:43:16 -04:00
Drew Kerrigan
9be6a709a2 syntax cleanup 2016-07-19 11:02:28 -04:00
Drew Kerrigan
06fab10fe2 added (*) syntax 2016-07-18 17:22:41 -04:00
Drew Kerrigan
7bdc802c2d consistent tabs 2016-07-18 14:15:56 -04:00
Drew Kerrigan
ed7bc7175b Merge pull request #20 from berosek/master
Support for custom HTTP Headers added using -A parameter.
2016-02-23 22:11:57 +01:00
Drew Kerrigan
4180ec2066 Merge pull request #22 from artschwagerb/master
fix variable error
2016-02-23 22:10:47 +01:00
Brian Artschwager
a0d0773d1a fix variable error
NameError: global name 'critical_message' is not defined
2016-02-18 10:42:44 -05:00
Beri
fbebf05f76 Support for custom HTTP Headers added using -A parameter. 2016-01-08 16:21:00 +01:00
drewkerrigan
6f9048fc75 updating docs 2015-11-19 13:55:16 -05:00
drewkerrigan
5bb09cd362 updating docs 2015-11-19 13:55:07 -05:00
drewkerrigan
568fa6e4d0 updating docs 2015-11-19 13:54:53 -05:00
drewkerrigan
f63ac180b6 updating docs 2015-11-19 13:54:25 -05:00
drewkerrigan
070047cf55 updating docs 2015-11-19 13:52:54 -05:00
drewkerrigan
8adcf2ff07 updating docs 2015-11-19 13:49:24 -05:00
drewkerrigan
fb4e58b635 Adding support for -E, -Q, -w, -c, fixing threshold checking on -m, added UnitTest task, removed -l, -g 2015-11-18 23:07:50 -05:00
drewkerrigan
32a8884881 added ability to supply an alias for a key re: #10 2015-11-15 23:50:12 -05:00
drewkerrigan
2644151b5f adding header for nagios configuration 2015-10-05 12:40:52 -04:00
drewkerrigan
55b979f3e2 Merge branch 'master' of github.com:drewkerrigan/nagios-http-json 2015-10-05 10:36:46 -04:00
drewkerrigan
369d5115a3 extra debugging 2015-10-05 10:36:40 -04:00
Drew Kerrigan
ea7edf5d01 Merge pull request #17 from billmoritz/metrics-fix
Return metrics no matter the result
2015-08-31 12:55:17 -04:00
Bill Moritz
a4be4d42c6 Return metrics no matter the result 2015-08-31 11:36:27 -04:00
Drew Kerrigan
cb0a5927c2 Merge pull request #15 from billmoritz/http-post
Http post
2015-08-22 18:22:07 -04:00
Bill Moritz
42e75abcad Update README.md 2015-08-22 09:47:25 -04:00
Bill Moritz
3058176ba1 Add data argument
Add an option to HTTP POST data to the host.
2015-08-22 09:42:06 -04:00
Drew Kerrigan
e4334d0c4a Merge pull request #13 from nejec/master
Allow multiple key value specification
2015-07-29 11:45:04 -07:00
Jernej Porenta
b9d03c899f Allow multiple key value specification
Multiple key values can be specified by using colon delimiter.
2015-07-28 09:08:55 +02:00
Drew Kerrigan
15c5075cc1 Merge pull request #12 from MrOppermann/public-readme.me-and-script-help-are-different
synchronized readme.me and help of plugin regarding usage description
2015-07-13 17:22:42 -04:00
frederic.oppermann
1be1b2e5a2 synchronized readme.me and help of plugin regarding usage description 2015-07-13 17:05:27 +02:00
Drew Kerrigan
1772543ee3 Merge pull request #9 from invertigo/master
add arguments for http timeout and tcp port
2015-05-07 19:48:51 -04:00
root
fe2e830bf7 add arguments for http timeout and tcp port 2015-05-07 22:19:23 +00:00
28 changed files with 2517 additions and 401 deletions

.github/workflows/unittest.yml (new file, 31 lines)

@@ -0,0 +1,31 @@
name: CI

on:
  push:
    branches: [main, master]
    tags:
      - v*
  pull_request:

jobs:
  gitHubActionForPytest:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.8, 3.11, 3.12]
    name: GitHub Action
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Install dependencies
        run: |
          python -m pip install -r requirements-dev.txt
      - name: Lint
        run: |
          make lint
      - name: Unit Test
        run: |
          make test
      - name: Coverage
        run: |
          make coverage

.gitignore (new file, 65 lines)

@@ -0,0 +1,65 @@
#Emacs
\#*
.\#*
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg
.venv/
venv/
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover
.hypothesis/
# Translations
*.mo
*.pot
# Django stuff:
*.log
# Sphinx documentation
docs/_build/
# PyBuilder
target/
#Ipython Notebook
.ipynb_checkpoints

.pylintrc (new file, 21 lines)

@@ -0,0 +1,21 @@
# pylint config

[MASTER]
ignore-patterns=^test.*

[MESSAGES CONTROL]
disable=fixme,
        consider-using-f-string,
        invalid-name,
        line-too-long,
        missing-function-docstring,
        missing-module-docstring,
        multiple-imports,
        no-else-return,
        redefined-outer-name,
        superfluous-parens,
        too-many-locals,
        too-many-arguments,
        too-many-branches,
        too-many-instance-attributes,
        too-many-return-statements,
        too-many-statements

Makefile (new file, 11 lines)

@@ -0,0 +1,11 @@
.PHONY: lint test coverage

PYTHON_PATH?=python3

lint:
	$(PYTHON_PATH) -m pylint check_http_json.py

test:
	$(PYTHON_PATH) -m unittest discover

coverage:
	$(PYTHON_PATH) -m coverage run -m unittest discover
	$(PYTHON_PATH) -m coverage report -m --include check_http_json.py
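The `PYTHON_PATH` variable can be overridden per invocation, for example to run the test suite against a specific interpreter (the interpreter name below is only an example):
```
make test PYTHON_PATH=python3.12
```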

README.md (407 lines changed)

@@ -1,13 +1,16 @@
![CI](https://github.com/drewkerrigan/nagios-http-json/workflows/CI/badge.svg)
# Nagios Json Plugin
This is a generic plugin for Nagios which checks json values from a given HTTP endpoint against argument specified rules and determines the status and performance data for that service.
### Installation
## Installation
#### Requirements
Requirements:
* Nagios
* Python
* Python 3.8+
### Nagios
Assuming a standard installation of Nagios, the plugin can be executed from the machine that Nagios is running on.
@@ -35,221 +38,265 @@ Add the following command definition to your commands config (`commands.config`)
define command{
command_name <command_name>
command_line /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H <host>:<port> -p <path> [-e|-q|-l|-g <rules>] [-m <metrics>]
command_line /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H <host>:<port> -p <path> [-e|-q|-w|-c <rules>] [-m <metrics>]
}
```
More info about options in Usage.
### Icinga2
### CLI Usage
An example Icinga2 command definition can be found here: (`contrib/icinga2_check_command_definition.conf`)
## Usage
Executing `./check_http_json.py -h` will yield the following details:
```
usage: check_http_json.py [-h] -H HOST [-B AUTH] [-p PATH]
usage: check_http_json.py [-h] [-d] [-s] -H HOST [-k] [-V] [--cacert CACERT]
[--cert CERT] [--key KEY] [-P PORT] [-p PATH]
[-t TIMEOUT] [-B AUTH] [-D DATA] [-A HEADERS]
[-f FIELD_SEPARATOR] [-F VALUE_SEPARATOR]
[-w [KEY_THRESHOLD_WARNING [KEY_THRESHOLD_WARNING ...]]]
[-c [KEY_THRESHOLD_CRITICAL [KEY_THRESHOLD_CRITICAL ...]]]
[-e [KEY_LIST [KEY_LIST ...]]]
[-E [KEY_LIST_CRITICAL [KEY_LIST_CRITICAL ...]]]
[-q [KEY_VALUE_LIST [KEY_VALUE_LIST ...]]]
[-l [KEY_LTE_LIST [KEY_LTE_LIST ...]]]
[-g [KEY_GTE_LIST [KEY_GTE_LIST ...]]]
[-m [METRIC_LIST [METRIC_LIST ...]]] [-s]
[-f SEPARATOR] [-d]
[-Q [KEY_VALUE_LIST_CRITICAL [KEY_VALUE_LIST_CRITICAL ...]]]
[-u [KEY_VALUE_LIST_UNKNOWN [KEY_VALUE_LIST_UNKNOWN ...]]]
[-y [KEY_VALUE_LIST_NOT [KEY_VALUE_LIST_NOT ...]]]
[-Y [KEY_VALUE_LIST_NOT_CRITICAL [KEY_VALUE_LIST_NOT_CRITICAL ...]]]
[-m [METRIC_LIST [METRIC_LIST ...]]]
Nagios plugin which checks json values from a given endpoint against argument
specified rules and determines the status and performance data for that
service
Check HTTP JSON Nagios Plugin
optional arguments:
Generic Nagios plugin which checks json values from a given endpoint against
argument specified rules and determines the status and performance data for
that service.
Version: 2.2.0 (2024-05-14)
options:
-h, --help show this help message and exit
-H HOST, --host HOST Host.
-d, --debug debug mode
-v, --verbose Verbose mode. Multiple -v options increase the verbosity
-s, --ssl use TLS to connect to remote host
-H HOST, --host HOST remote host to query
-k, --insecure do not check server SSL certificate
-X {GET,POST}, --request {GET,POST}
Specifies a custom request method to use when communicating with the HTTP server
-V, --version print version of this plugin
--cacert CACERT SSL CA certificate
--cert CERT SSL client certificate
--key KEY SSL client key ( if not bundled into the cert )
-P PORT, --port PORT TCP port
-p PATH, --path PATH Path
-t TIMEOUT, --timeout TIMEOUT
Connection timeout (seconds)
--unreachable-state UNREACHABLE_STATE
Exit with specified code when the URL is unreachable. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)
--invalid-json-state INVALID_JSON_STATE
Exit with specified code when no valid JSON is returned. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)
-B AUTH, --basic-auth AUTH
Basic auth string "username:password"
-p PATH, --path PATH Path.
-e [KEY_LIST [KEY_LIST ...]], --key_exists [KEY_LIST [KEY_LIST ...]]
Checks existence of these keys to determine status.
-q [KEY_VALUE_LIST [KEY_VALUE_LIST ...]], --key_equals [KEY_VALUE_LIST [KEY_VALUE_LIST ...]]
Checks equality of these keys and values (key,value
key2,value2) to determine status.
-l [KEY_LTE_LIST [KEY_LTE_LIST ...]], --key_lte [KEY_LTE_LIST [KEY_LTE_LIST ...]]
Checks that these keys and values (key,value
key2,value2) are less than or equal to the returned
json value to determine status.
-g [KEY_GTE_LIST [KEY_GTE_LIST ...]], --key_gte [KEY_GTE_LIST [KEY_GTE_LIST ...]]
Checks that these keys and values (key,value
key2,value2) are greater than or equal to the returned
json value to determine status.
-m [METRIC_LIST [METRIC_LIST ...]], --key_metric [METRIC_LIST [METRIC_LIST ...]]
Gathers the values of these keys
(key,UnitOfMeasure,Min,Max,WarnRange,CriticalRange)
for Nagios performance data. More information about
Range format and units of measure for nagios can be
found at https://nagios-
plugins.org/doc/guidelines.html Additional formats for
this parameter are: (key), (key,UnitOfMeasure),
(key,UnitOfMeasure,Min,Max).
-s, --ssl HTTPS mode.
-D DATA, --data DATA The http payload to send as a POST
-A HEADERS, --headers HEADERS
The http headers in JSON format.
-f SEPARATOR, --field_separator SEPARATOR
Json Field separator, defaults to "." ; Select element
in an array with "(" ")"
-d, --debug Debug mode.
JSON Field separator, defaults to "."; Select element in an array with "(" ")"
-F VALUE_SEPARATOR, --value_separator VALUE_SEPARATOR
JSON Value separator, defaults to ":"
-w [KEY_THRESHOLD_WARNING ...], --warning [KEY_THRESHOLD_WARNING ...]
Warning threshold for these values (key1[>alias],WarnRange key2[>alias],WarnRange). WarnRange is in the format
[@]start:end, more information at nagios-plugins.org/doc/guidelines.html.
-c [KEY_THRESHOLD_CRITICAL ...], --critical [KEY_THRESHOLD_CRITICAL ...]
Critical threshold for these values (key1[>alias],CriticalRange key2[>alias],CriticalRange. CriticalRange is in
the format [@]start:end, more information at nagios-plugins.org/doc/guidelines.html.
-e [KEY_LIST ...], --key_exists [KEY_LIST ...]
Checks existence of these keys to determine status. Return warning if key is not present.
-E [KEY_LIST_CRITICAL ...], --key_exists_critical [KEY_LIST_CRITICAL ...]
Same as -e but return critical if key is not present.
-q [KEY_VALUE_LIST ...], --key_equals [KEY_VALUE_LIST ...]
Checks equality of these keys and values (key[>alias],value key2,value2) to determine status. Multiple key values
can be delimited with colon (key,value1:value2). Return warning if equality check fails
-Q [KEY_VALUE_LIST_CRITICAL ...], --key_equals_critical [KEY_VALUE_LIST_CRITICAL ...]
Same as -q but return critical if equality check fails.
--key_time [KEY_TIME_LIST ...],
Checks a Timestamp of these keys and values
(key[>alias],value key2,value2) to determine status.
Multiple key values can be delimited with colon
(key,value1:value2). Return warning if the key is older
than the value (ex.: 30s,10m,2h,3d,...).
With at it return warning if the key is jounger
than the value (ex.: @30s,@10m,@2h,@3d,...).
With Minus you can shift the time in the future.
--key_time_critical [KEY_TIME_LIST_CRITICAL ...],
Same as --key_time but return critical if
Timestamp age fails.
-u [KEY_VALUE_LIST_UNKNOWN ...], --key_equals_unknown [KEY_VALUE_LIST_UNKNOWN ...]
Same as -q but return unknown if equality check fails.
-y [KEY_VALUE_LIST_NOT ...], --key_not_equals [KEY_VALUE_LIST_NOT ...]
Checks equality of these keys and values (key[>alias],value key2,value2) to determine status. Multiple key values
can be delimited with colon (key,value1:value2). Return warning if equality check succeeds
-Y [KEY_VALUE_LIST_NOT_CRITICAL ...], --key_not_equals_critical [KEY_VALUE_LIST_NOT_CRITICAL ...]
Same as -q but return critical if equality check succeeds.
-m [METRIC_LIST ...], --key_metric [METRIC_LIST ...]
Gathers the values of these keys (key[>alias], UnitOfMeasure,WarnRange,CriticalRange,Min,Max) for Nagios
performance data. More information about Range format and units of measure for nagios can be found at nagios-
plugins.org/doc/guidelines.html Additional formats for this parameter are: (key[>alias]),
(key[>alias],UnitOfMeasure), (key[>alias],UnitOfMeasure,WarnRange, CriticalRange).
```
Access a specific JSON field by following this syntax: `alpha.beta.gamma(3).theta.omega(0)`
Dots are field separators (changeable); parentheses are for entering arrays.
The check plugin respects the `HTTP_PROXY` and `HTTPS_PROXY` environment variables.
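For example, a proxy can be supplied through the environment for a single invocation (the proxy URL, host, path, and key below are placeholders):
```
HTTPS_PROXY=http://proxy.example.com:3128 ./check_http_json.py -H <host>:<port> -p <path> -e metric
```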
If the root of the JSON data is itself an array like the following:
## Examples
```
[
{ "gauges": { "jvm.buffers.direct.capacity": {"value": 215415}}}
]
```
### Key Naming
The beginning of the key should start with ($index) as in this example:
**Data for key** `value`:
```
./check_http_json.py -H localhost:8081 -p metrics --key_exists "(0)_gauges_jvm.buffers.direct.capacity_value" -f _
```
{ "value": 1000 }
**Data for key** `capacity.value`:
{
"capacity": {
"value": 1000
}
}
**Data for key** `(0).capacity.value`:
[
{
"capacity": {
"value": 1000
}
}
]
**Data for keys of all items in a list** `(*).capacity.value`:
[
{
"capacity": {
"value": 1000
}
},
{
"capacity": {
"value": 2200
}
}
]
**Data for separator** `-f _` **and key** `(0)_gauges_jvm.buffers.direct.capacity_value`:
[
{
"gauges": {
"jvm.buffers.direct.capacity":
"value": 1000
}
}
}
]
**Data for keys** `ring_members(0)`, `ring_members(1)`, `ring_members(2)`:
{
"ring_members": [
"riak1@127.0.0.1",
"riak2@127.0.0.1",
"riak3@127.0.0.1"
]
}
**Data for multiple keys for an object** `-q capacity1.value,True capacity2.value,True capacity3.value,True`
{
"capacity1": {
"value": true
},
"capacity2": {
"value": true
},
"capacity3": {
"value": true
}
}
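A full command line for the multiple-key example above could look like this (host and path are placeholders):
```
./check_http_json.py -H <host>:<port> -p <path> -q capacity1.value,True capacity2.value,True capacity3.value,True
```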
### Thresholds and Ranges
**Data**:
{ "metric": 1000 }
#### Relevant Commands
* **Warning:** `./check_http_json.py -H <host>:<port> -p <path> -w "metric,RANGE"`
* **Critical:** `./check_http_json.py -H <host>:<port> -p <path> -c "metric,RANGE"`
* **Metrics with Warning:** `./check_http_json.py -H <host>:<port> -p <path> -w "metric,RANGE"`
* **Metrics with Critical:**
./check_http_json.py -H <host>:<port> -p <path> -w "metric,,,RANGE"
./check_http_json.py -H <host>:<port> -p <path> -w "metric,,,,MIN,MAX"
#### Range Definitions
* **Format:** [@]START:END
* **Generates a Warning or Critical if...**
* **Value is less than 0 or greater than 1000:** `1000` or `0:1000`
* **Value is greater than or equal to 1000, or less than or equal to 0:** `@1000` or `@0:1000`
* **Value is less than 1000:** `1000:`
* **Value is greater than 1000:** `~:1000`
* **Value is greater than or equal to 1000:** `@1000:`
More info about Nagios Range format and Units of Measure can be found at [https://nagios-plugins.org/doc/guidelines.html](https://nagios-plugins.org/doc/guidelines.html).
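For instance, with the data above (host and path are placeholders), the following invocation warns when `metric` falls outside the range 0:1000 and goes critical when it falls outside 0:1200; with `metric` at 1000 the check returns OK:
```
./check_http_json.py -H <host>:<port> -p <path> -w "metric,0:1000" -c "metric,0:1200"
```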
### Docker Info Example Plugin
### Timestamp
#### Description
**Data**:
Let's say we want to use `check_http_json.py` to read from Docker's `/info` HTTP API endpoint with the following parameters:
{ "metric": "2020-01-01 10:10:00.000000+00:00" }
##### Connection information
#### Relevant Commands
* Host = 127.0.0.1:4243
* Path = /info
* **Warning:** `./check_http_json.py -H <host>:<port> -p <path> --key_time "metric,TIME"`
* **Critical:** `./check_http_json.py -H <host>:<port> -p <path> --key_time_critical "metric,TIME"`
##### Rules for "aliveness"
#### TIME Definitions
* Verify that the key `Containers` exists in the outputted JSON
* Verify that the key `IPv4Forwarding` has a value of `1`
* Verify that the key `Debug` has a value less than or equal to `2`
* Verify that the key `Images` has a value greater than or equal to `1`
* If any of these criteria are not met, report a WARNING to Nagios
* **Format:** [@][-]TIME
* **Generates a Warning or Critical if...**
* **Timestamp is more than 30 seconds in the past:** `30s`
* **Timestamp is more than 5 minutes in the past:** `5m`
* **Timestamp is more than 12 hours in the past:** `12h`
* **Timestamp is more than 2 days in the past:** `2d`
* **Timestamp is more than 30 minutes in the future:** `-30m`
* **Timestamp is not more than 30 minutes in the future:** `@-30m`
* **Timestamp is not more than 30 minutes in the past:** `@30m`
##### Gather Metrics
##### Timestamp Format
* Report value of the key `Containers` with a MinValue of 0 and a MaxValue of 1000 as performance data
* Report value of the key `Images` as performance data
* Report value of the key `NEventsListener` as performance data
* Report value of the key `NFd` as performance data
* Report value of the key `NGoroutines` as performance data
* Report value of the key `SwapLimit` as performance data
This plugin uses the Python function 'datetime.fromisoformat'.
Since Python 3.11 any valid ISO 8601 format is supported, with the following exceptions:
#### Service Definition
* Time zone offsets may have fractional seconds.
* The T separator may be replaced by any single unicode character.
* Fractional hours and minutes are not supported.
* Reduced precision dates are not currently supported (YYYY-MM, YYYY).
* Extended date representations are not currently supported (±YYYYYY-MM-DD).
* Ordinal dates are not currently supported (YYYY-OOO).
`localhost.cfg`
Before Python 3.11, this method only supported the format YYYY-MM-DD
```
More info and examples about the Timestamp format can be found at [https://docs.python.org/3/library/datetime.html#datetime.datetime.fromisoformat](https://docs.python.org/3/library/datetime.html#datetime.datetime.fromisoformat).
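The age comparison behind `--key_time` can be reproduced in plain Python; a minimal sketch, assuming the sample timestamp above and a 30-minute threshold chosen only for illustration:
```
from datetime import datetime, timedelta, timezone

timestamp = datetime.fromisoformat("2020-01-01 10:10:00.000000+00:00")
if timestamp.tzinfo is None:
    timestamp = timestamp.replace(tzinfo=timezone.utc)
age = datetime.now(timezone.utc) - timestamp
# True here, so `--key_time "metric,30m"` would raise a WARNING
print(age > timedelta(minutes=30))
```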
define service {
use local-service
host_name localhost
service_description Docker info status checker
check_command check_docker
}
#### Using Headers
```
#### Command Definition with Arguments
`commands.cfg`
```
define command{
command_name check_docker
command_line /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H 127.0.0.1:4243 -p info -e Containers -q IPv4Forwarding,1 -l Debug,2 -g Images,1 -m Containers,,0,1000 Images NEventsListener NFd NGoroutines SwapLimit
}
```
#### Sample Output
```
OK: Status OK.|'Containers'=1;0;1000 'Images'=11;0;0 'NEventsListener'=3;0;0 'NFd'=10;0;0 'NGoroutines'=14;0;0 'SwapLimit'=1;0;0
```
### Docker Container Monitor Example Plugin
`check_http_json.py` is generic enough to read and evaluate rules on any HTTP endpoint that returns JSON. In this example we'll get the status of a specific container using its ID, which can be found by using the list containers endpoint (`curl http://127.0.0.1:4243/containers/json?all=1`).
##### Connection information
* Host = 127.0.0.1:4243
* Path = /containers/2356e8ccb3de8308ccb16cf8f5d157bc85ded5c3d8327b0dfb11818222b6f615/json
##### Rules for "aliveness"
* Verify that the key `ID` exists and is equal to the value `2356e8ccb3de8308ccb16cf8f5d157bc85ded5c3d8327b0dfb11818222b6f615`
* Verify that the key `State.Running` has a value of `True`
#### Service Definition
`localhost.cfg`
```
define service {
use local-service
host_name localhost
service_description Docker container liveness check
check_command check_my_container
}
```
#### Command Definition with Arguments
`commands.cfg`
```
define command{
command_name check_my_container
command_line /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H 127.0.0.1:4243 -p /containers/2356e8ccb3de8308ccb16cf8f5d157bc85ded5c3d8327b0dfb11818222b6f615/json -q ID,2356e8ccb3de8308ccb16cf8f5d157bc85ded5c3d8327b0dfb11818222b6f615 State.Running,True
}
```
#### Sample Output
```
WARNING: Status check failed, reason: Value True for key State.Running did not match.
```
The plugin threw a warning because the Container ID I used on my system has the following State object:
```
u'State': {...
u'Running': False,
...
```
If I change the -q parameter to `State.Running,False`, the output becomes:
```
OK: Status OK.
```
### Dropwizard / Fieldnames Containing '.' Example
Simply choose a separator to deal with data such as this:
```
{ "gauges": { "jvm.buffers.direct.capacity": {"value": 215415}}}
```
In this example I've chosen `_` to separate `gauges` from `jvm` and `capacity` from `value`. The CLI invocation then becomes:
```
./check_http_json.py -H localhost:8081 -p metrics --key_exists gauges_jvm.buffers.direct.capacity_value -f _
```
* `./check_http_json.py -H <host>:<port> -p <path> -A '{"content-type": "application/json"}' -w "metric,RANGE"`
## License

check_http_json.py

@@ -1,50 +1,100 @@
#!/usr/bin/python
#!/usr/bin/env python3
import urllib.request, urllib.error, urllib.parse
import base64
import json
import argparse
import sys
import ssl
import traceback
from urllib.error import HTTPError
from urllib.error import URLError
from datetime import datetime, timedelta, timezone
plugin_description = \
"""
Check HTTP JSON Nagios Plugin
Generic Nagios plugin which checks json values from a given endpoint against argument specified rules
and determines the status and performance data for that service.
Generic Nagios plugin which checks json values from a given endpoint against
argument specified rules and determines the status and performance data for
that service.
"""
import httplib, urllib, urllib2, base64
import json
import argparse
from pprint import pprint
from urllib2 import HTTPError
from urllib2 import URLError
OK_CODE = 0
WARNING_CODE = 1
CRITICAL_CODE = 2
UNKNOWN_CODE = 3
__version__ = '2.3.0'
__version_date__ = '2025-04-11'
class NagiosHelper:
"""Help with Nagios specific status string formatting."""
code = 0
message_prefixes = {0: 'OK', 1: 'WARNING', 2: 'CRITICAL', 3: 'UNKNOWN'}
message_text = ''
"""
Help with Nagios specific status string formatting.
"""
message_prefixes = {OK_CODE: 'OK',
WARNING_CODE: 'WARNING',
CRITICAL_CODE: 'CRITICAL',
UNKNOWN_CODE: 'UNKNOWN'}
performance_data = ''
warning_message = ''
critical_message = ''
unknown_message = ''
def getMessage(self):
"""Build a status-prefixed message with optional performance data generated externally"""
text = "%s" % self.message_prefixes[self.code]
if self.message_text:
text += ": %s" % self.message_text
def getMessage(self, message=''):
"""
Build a status-prefixed message with optional performance data
generated externally
"""
message += self.warning_message
message += self.critical_message
message += self.unknown_message
code = self.message_prefixes[self.getCode()]
output = "{code}: Status {code}. {message}".format(code=code, message=message.strip())
if self.performance_data:
text += "|%s" % self.performance_data
return text
output = "{code}: {perf_data} Status {code}. {message}|{perf_data}".format(
code=code,
message=message.strip(),
perf_data=self.performance_data)
return output.strip()
def setCodeAndMessage(self, code, text):
self.code = code
self.message_text = text
def getCode(self):
code = OK_CODE
if (self.warning_message != ''):
code = WARNING_CODE
if (self.critical_message != ''):
code = CRITICAL_CODE
if (self.unknown_message != ''):
code = UNKNOWN_CODE
return code
def append_message(self, code, msg):
if code > 2 or code < 0:
self.unknown_message += msg
if code == 1:
self.warning_message += msg
if code == 2:
self.critical_message += msg
def append_metrics(self, metrics):
(performance_data, warning_message, critical_message) = metrics
self.performance_data += performance_data
self.append_message(WARNING_CODE, warning_message)
self.append_message(CRITICAL_CODE, critical_message)
def ok(self, text): self.setCodeAndMessage(0, text)
def warning(self, text): self.setCodeAndMessage(1, text)
def critical(self, text): self.setCodeAndMessage(2, text)
def unknown(self, text): self.setCodeAndMessage(3, text)
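# Editor's illustration (not part of the diff): a minimal sketch of how the
# NagiosHelper shown above is meant to be driven. The key name, metric string,
# and messages are invented; only the method names and constants come from the
# class definition above.
example_helper = NagiosHelper()
example_helper.append_message(WARNING_CODE, " Key foo did not exist.")
example_helper.append_metrics(("'bar'=42 ", '', ''))
print(example_helper.getMessage())  # WARNING-prefixed status line with the perfdata appended
print(example_helper.getCode())     # 1 (WARNING_CODE)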
class JsonHelper:
"""Perform simple comparison operations against values in a given JSON dict"""
def __init__(self, json_data, separator):
"""
Perform simple comparison operations against values in a given
JSON dict
"""
def __init__(self, json_data, separator, value_separator):
self.data = json_data
self.separator = separator
self.value_separator = value_separator
self.arrayOpener = '('
self.arrayCloser = ')'
@@ -54,210 +104,675 @@ class JsonHelper:
remainingKey = key[separatorIndex + 1:]
if partialKey in data:
return self.get(remainingKey, data[partialKey])
else:
return (None, 'not_found')
def getSubArrayElement(self, key, data):
subElemKey = key[:key.find(self.arrayOpener)]
index = int(key[key.find(self.arrayOpener) + 1:key.find(self.arrayCloser)])
index = int(key[key.find(self.arrayOpener) +
1:key.find(self.arrayCloser)])
remainingKey = key[key.find(self.arrayCloser + self.separator) + 2:]
if key.find(self.arrayCloser + self.separator) == -1:
remainingKey = key[key.find(self.arrayCloser) + 1:]
if subElemKey in data:
if index < len(data[subElemKey]):
return self.get(remainingKey, data[subElemKey][index])
else:
return (None, 'not_found')
if index >= len(data):
return (None, 'not_found')
else:
if not subElemKey:
return self.get(remainingKey, data[index])
else:
return (None, 'not_found')
def equals(self, key, value): return self.exists(key) and str(self.get(key)) == value
def lte(self, key, value): return self.exists(key) and float(self.get(key)) <= float(value)
def gte(self, key, value): return self.exists(key) and float(self.get(key)) >= float(value)
def exists(self, key): return (self.get(key) != (None, 'not_found'))
def equals(self, key, value):
return self.exists(key) and \
str(self.get(key)) in value.split(self.value_separator)
def lte(self, key, value):
return self.exists(key) and float(self.get(key)) <= float(value)
def lt(self, key, value):
return self.exists(key) and float(self.get(key)) < float(value)
def gte(self, key, value):
return self.exists(key) and float(self.get(key)) >= float(value)
def gt(self, key, value):
return self.exists(key) and float(self.get(key)) > float(value)
def exists(self, key):
return (self.get(key) != (None, 'not_found'))
def get(self, key, temp_data=''):
"""Can navigate nested json keys with a dot format (Element.Key.NestedKey). Returns (None, 'not_found') if not found"""
if temp_data:
"""
Can navigate nested json keys with a dot format
(Element.Key.NestedKey). Returns (None, 'not_found') if not found
"""
if temp_data != '':
data = temp_data
else:
data = self.data
if len(key) <= 0:
return data
if key.find(self.separator) != -1 and key.find(self.arrayOpener) != -1 :
if key.find(self.separator) < key.find(self.arrayOpener) :
if key.find(self.separator) != -1 and \
key.find(self.arrayOpener) != -1:
if key.find(self.separator) < key.find(self.arrayOpener):
return self.getSubElement(key, data)
else:
return self.getSubArrayElement(key, data)
else:
if key.find(self.separator) != -1 :
if key.find(self.separator) != -1:
return self.getSubElement(key, data)
else:
if key.find(self.arrayOpener) != -1 :
if key.find(self.arrayOpener) != -1:
return self.getSubArrayElement(key, data)
else:
if key in data:
if isinstance(data, dict) and key in data:
return data[key]
else:
return (None, 'not_found')
def expandKey(self, key, keys):
if '(*)' not in key:
keys.append(key)
return keys
subElemKey = ''
if key.find('(*)') > 0:
subElemKey = key[:key.find('(*)')-1]
remainingKey = key[key.find('(*)')+3:]
elemData = self.get(subElemKey)
if elemData == (None, 'not_found'):
keys.append(key)
return keys
if subElemKey != '':
subElemKey = subElemKey + '.'
for i in range(len(elemData)):
newKey = subElemKey + '(' + str(i) + ')' + remainingKey
newKeys = self.expandKey(newKey, [])
for j in newKeys:
keys.append(j)
return keys
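# Editor's illustration (not part of the diff): a minimal sketch of navigating
# nested JSON with the JsonHelper defined above; the sample data is invented.
sample = {"capacity": {"value": 1000}, "ring_members": ["riak1@127.0.0.1", "riak2@127.0.0.1"]}
sample_helper = JsonHelper(sample, '.', ':')
print(sample_helper.get('capacity.value'))                  # 1000
print(sample_helper.get('ring_members(1)'))                 # riak2@127.0.0.1
print(sample_helper.exists('missing.key'))                  # False
print(sample_helper.equals('capacity.value', '1000:2000'))  # True, matches either listed value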
def _getKeyAlias(original_key):
key = original_key
alias = original_key
if '>' in original_key:
keys = original_key.split('>')
if len(keys) == 2:
key, alias = keys
return key, alias
class JsonRuleProcessor:
"""Perform checks and gather values from a JSON dict given rules and metrics definitions"""
"""
Perform checks and gather values from a JSON dict given rules
and metrics definitions
"""
def __init__(self, json_data, rules_args):
self.data = json_data
self.rules = rules_args
separator = '.'
if self.rules.separator: separator = self.rules.separator
self.helper = JsonHelper(self.data, separator)
value_separator = ':'
if self.rules.separator:
separator = self.rules.separator
if self.rules.value_separator:
value_separator = self.rules.value_separator
self.helper = JsonHelper(self.data, separator, value_separator)
debugPrint(rules_args.debug, "rules: %s" % rules_args)
debugPrint(rules_args.debug, "separator: %s" % separator)
debugPrint(rules_args.debug, "value_separator: %s" % value_separator)
self.metric_list = self.expandKeys(self.rules.metric_list)
self.key_threshold_warning = self.expandKeys(
self.rules.key_threshold_warning)
self.key_threshold_critical = self.expandKeys(
self.rules.key_threshold_critical)
self.key_value_list = self.expandKeys(self.rules.key_value_list)
self.key_value_list_not = self.expandKeys(
self.rules.key_value_list_not)
self.key_time_list = self.expandKeys(self.rules.key_time_list)
self.key_list = self.expandKeys(self.rules.key_list)
self.key_value_list_critical = self.expandKeys(
self.rules.key_value_list_critical)
self.key_value_list_not_critical = self.expandKeys(
self.rules.key_value_list_not_critical)
self.key_time_list_critical = self.expandKeys(self.rules.key_time_list_critical)
self.key_list_critical = self.expandKeys(self.rules.key_list_critical)
self.key_value_list_unknown = self.expandKeys(
self.rules.key_value_list_unknown)
debugPrint(rules_args.debug, "separator:%s" % separator)
def expandKeys(self, src):
if src is None:
return []
dest = []
for key in src:
newKeys = self.helper.expandKey(key, [])
for k in newKeys:
dest.append(k)
return dest
def isAlive(self):
"""Return a tuple with liveness and reason for not liveness given existence, equality, and comparison rules"""
reason = ''
def checkExists(self, exists_list):
failure = ''
for k in exists_list:
key, alias = _getKeyAlias(k)
if (self.helper.exists(key) is False):
failure += " Key %s did not exist." % alias
return failure
if self.rules.key_list != None:
for k in self.rules.key_list:
if (self.helper.exists(k) == False):
reason += " Key %s did not exist." % k
if self.rules.key_value_list != None:
for kv in self.rules.key_value_list:
def checkEquality(self, equality_list):
failure = ''
for kv in equality_list:
k, v = kv.split(',')
if (self.helper.equals(k, v) == False):
reason += " Value %s for key %s did not match." % (v, k)
key, alias = _getKeyAlias(k)
if not self.helper.equals(key, v):
failure += " Key %s mismatch. %s != %s" % (alias, v,
self.helper.get(key))
return failure
if self.rules.key_lte_list != None:
for kv in self.rules.key_lte_list:
def checkNonEquality(self, equality_list):
failure = ''
for kv in equality_list:
k, v = kv.split(',')
if (self.helper.lte(k, v) == False):
reason += " Value %s was not less than or equal to value for key %s." % (v, k)
key, alias = _getKeyAlias(k)
if self.helper.equals(key, v):
failure += " Key %s match found. %s == %s" % (alias, v,
self.helper.get(key))
return failure
if self.rules.key_gte_list != None:
for kv in self.rules.key_gte_list:
k, v = kv.split(',')
if (self.helper.gte(k, v) == False):
reason += " Value %s was not greater than or equal to value for key %s." % (v, k)
def checkThreshold(self, key, alias, r):
failure = ''
invert = False
start = 0
end = 'infinity'
if r.startswith('@'):
invert = True
r = r[1:]
vals = r.split(':')
if len(vals) == 1:
end = vals[0]
if len(vals) == 2:
start = vals[0]
if vals[1] != '':
end = vals[1]
if(start == '~'):
if (invert and self.helper.lte(key, end)):
failure += " Value (%s) for key %s was less than or equal to %s." % \
(self.helper.get(key), alias, end)
elif (not invert and self.helper.gt(key, end)):
failure += " Value (%s) for key %s was greater than %s." % \
(self.helper.get(key), alias, end)
elif(end == 'infinity'):
if (invert and self.helper.gte(key, start)):
failure += " Value (%s) for key %s was greater than or equal to %s." % \
(self.helper.get(key), alias, start)
elif (not invert and self.helper.lt(key, start)):
failure += " Value (%s) for key %s was less than %s." % \
(self.helper.get(key), alias, start)
else:
if (invert and self.helper.gte(key, start) and
self.helper.lte(key, end)):
failure += " Value (%s) for key %s was inside the range %s:%s." % \
(self.helper.get(key), alias, start, end)
elif (not invert and (self.helper.lt(key, start) or
self.helper.gt(key, end))):
failure += " Value (%s) for key %s was outside the range %s:%s." % \
(self.helper.get(key), alias, start, end)
is_alive = (reason == '')
return failure
return (is_alive, reason)
def checkThresholds(self, threshold_list):
failure = ''
for threshold in threshold_list:
k, r = threshold.split(',')
key, alias = _getKeyAlias(k)
failure += self.checkThreshold(key, alias, r)
return failure
def checkTimestamp(self, key, alias, r):
failure = ''
invert = False
negative = False
if r.startswith('@'):
invert = True
r = r[1:]
if r.startswith('-'):
negative = True
r = r[1:]
duration = int(r[:-1])
unit = r[-1]
if unit == 's':
tiemduration = timedelta(seconds=duration)
elif unit == 'm':
tiemduration = timedelta(minutes=duration)
elif unit == 'h':
tiemduration = timedelta(hours=duration)
elif unit == 'd':
tiemduration = timedelta(days=duration)
else:
return " Value (%s) is not a vaild timeduration." % (r)
if not self.helper.exists(key):
return " Key (%s) for key %s not Exists." % \
(key, alias)
try:
timestamp = datetime.fromisoformat(self.helper.get(key))
except ValueError as ve:
return " Value (%s) for key %s is not a Date in ISO format. %s" % \
(self.helper.get(key), alias, ve)
now = datetime.now(timezone.utc)
if timestamp.tzinfo is None:
timestamp = timestamp.replace(tzinfo=timezone.utc)
age = now - timestamp
if not negative:
if age > tiemduration and not invert:
failure += " Value (%s) for key %s is older than now-%s%s." % \
(self.helper.get(key), alias, duration, unit)
if not age > tiemduration and invert:
failure += " Value (%s) for key %s is newer than now-%s%s." % \
(self.helper.get(key), alias, duration, unit)
else:
if age < -tiemduration and not invert:
failure += " Value (%s) for key %s is newer than now+%s%s." % \
(self.helper.get(key), alias, duration, unit)
if not age < -tiemduration and invert:
failure += " Value (%s) for key %s is older than now+%s%s.." % \
(self.helper.get(key), alias, duration, unit)
return failure
def checkTimestamps(self, threshold_list):
failure = ''
for threshold in threshold_list:
k, r = threshold.split(',')
key, alias = _getKeyAlias(k)
failure += self.checkTimestamp(key, alias, r)
return failure
def checkWarning(self):
failure = ''
if self.key_threshold_warning is not None:
failure += self.checkThresholds(self.key_threshold_warning)
if self.key_value_list is not None:
failure += self.checkEquality(self.key_value_list)
if self.key_value_list_not is not None:
failure += self.checkNonEquality(self.key_value_list_not)
if self.key_time_list is not None:
failure += self.checkTimestamps(self.key_time_list)
if self.key_list is not None:
failure += self.checkExists(self.key_list)
return failure
def checkCritical(self):
failure = ''
if not self.data:
failure = " Empty JSON data."
if self.key_threshold_critical is not None:
failure += self.checkThresholds(self.key_threshold_critical)
if self.key_value_list_critical is not None:
failure += self.checkEquality(self.key_value_list_critical)
if self.key_value_list_not_critical is not None:
failure += self.checkNonEquality(self.key_value_list_not_critical)
if self.key_time_list_critical is not None:
failure += self.checkTimestamps(self.key_time_list_critical)
if self.key_list_critical is not None:
failure += self.checkExists(self.key_list_critical)
return failure
def checkUnknown(self):
unknown = ''
if self.key_value_list_unknown is not None:
unknown += self.checkEquality(self.key_value_list_unknown)
return unknown
def checkMetrics(self):
"""
Return a Nagios specific performance metrics string given keys
and parameter definitions
"""
def getMetrics(self):
"""Return a Nagios specific performance metrics string given keys and parameter definitions"""
metrics = ''
if self.rules.metric_list != None:
for metric in self.rules.metric_list:
warning = ''
critical = ''
if self.metric_list is not None:
for metric in self.metric_list:
key = metric
minimum = maximum = warn_range = crit_range = 0
minimum = maximum = warn_range = crit_range = None
uom = ''
if ',' in metric:
vals = metric.split(',')
if len(vals) == 2:
key,uom = vals
key, uom = vals
if len(vals) == 4:
key,uom,minimum,maximum = vals
key, uom, warn_range, crit_range = vals
if len(vals) == 6:
key,uom,minimum,maximum,warn_range,crit_range = vals
key, uom, warn_range, crit_range, \
minimum, maximum = vals
key, alias = _getKeyAlias(key)
if self.helper.exists(key):
metrics += "'%s'=%s" % (key, self.helper.get(key))
if uom: metrics += uom
metrics += "'%s'=%s" % (alias, self.helper.get(key))
if uom:
metrics += uom
if warn_range is not None:
warning += self.checkThreshold(key, alias, warn_range)
metrics += ";%s" % warn_range
if crit_range is not None:
critical += self.checkThreshold(key, alias, crit_range)
metrics += ";%s" % crit_range
if minimum is not None:
critical += self.checkThreshold(key, alias, minimum +
':')
metrics += ";%s" % minimum
if maximum is not None:
critical += self.checkThreshold(key, alias, '~:' +
maximum)
metrics += ";%s" % maximum
if warn_range: metrics += ";%s" % warn_range
if crit_range: metrics += ";%s" % crit_range
metrics += ' '
return ("%s" % metrics, warning, critical)
return "%s" % metrics
def parseArgs(args):
"""
CLI argument definitions and parsing
"""
def parseArgs():
parser = argparse.ArgumentParser(description=
'Nagios plugin which checks json values from a given endpoint against argument specified rules\
and determines the status and performance data for that service')
parser = argparse.ArgumentParser(
description=plugin_description + '\n\nVersion: %s (%s)'
%(__version__, __version_date__),
formatter_class=argparse.RawDescriptionHelpFormatter
)
parser.add_argument('-H', '--host', dest='host', required=True, help='Host.')
parser.add_argument('-B', '--basic-auth', dest='auth', required=False, help='Basic auth string "username:password"')
parser.add_argument('-p', '--path', dest='path', help='Path.')
parser.add_argument('-d', '--debug', action='store_true',
help='debug mode')
parser.add_argument('-v', '--verbose', action='count', default=0,
help='Verbose mode. Multiple -v options increase the verbosity')
parser.add_argument('-s', '--ssl', action='store_true',
help='use TLS to connect to remote host')
parser.add_argument('-H', '--host', dest='host',
required=not ('-V' in args or '--version' in args),
help='remote host to query')
parser.add_argument('-k', '--insecure', action='store_true',
help='do not check server SSL certificate')
parser.add_argument('-X', '--request', dest='method', default='GET', choices=['GET', 'POST'],
help='Specifies a custom request method to use when communicating with the HTTP server')
parser.add_argument('-V', '--version', action='store_true',
help='print version of this plugin')
parser.add_argument('--cacert',
dest='cacert', help='SSL CA certificate')
parser.add_argument('--cert',
dest='cert', help='SSL client certificate')
parser.add_argument('--key', dest='key',
help='SSL client key ( if not bundled into the cert )')
parser.add_argument('-P', '--port', dest='port', help='TCP port')
parser.add_argument('-p', '--path', dest='path', help='Path')
parser.add_argument('-t', '--timeout', type=int,
help='Connection timeout (seconds)')
parser.add_argument('--unreachable-state', type=int, default=3,
help='Exit with specified code when the URL is unreachable. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)')
parser.add_argument('--invalid-json-state', type=int, default=3,
help='Exit with specified code when no valid JSON is returned. Examples: 1 for Warning, 2 for Critical, 3 for Unknown (default: 3)')
parser.add_argument('-B', '--basic-auth', dest='auth',
help='Basic auth string "username:password"')
parser.add_argument('-D', '--data', dest='data',
help='The http payload to send as a POST')
parser.add_argument('-A', '--headers', dest='headers',
help='The http headers in JSON format.')
parser.add_argument('-f', '--field_separator', dest='separator',
help='''JSON Field separator, defaults to ".";
Select element in an array with "(" ")"''')
parser.add_argument('-F', '--value_separator', dest='value_separator',
help='''JSON Value separator, defaults to ":"''')
parser.add_argument('-w', '--warning', dest='key_threshold_warning',
nargs='*',
help='''Warning threshold for these values
(key1[>alias],WarnRange key2[>alias],WarnRange).
WarnRange is in the format [@]start:end, more
information at
nagios-plugins.org/doc/guidelines.html.''')
parser.add_argument('-c', '--critical', dest='key_threshold_critical',
nargs='*',
help='''Critical threshold for these values
(key1[>alias],CriticalRange key2[>alias],CriticalRange.
CriticalRange is in the format [@]start:end, more
information at
nagios-plugins.org/doc/guidelines.html.''')
parser.add_argument('-e', '--key_exists', dest='key_list', nargs='*',
help='''Checks existence of these keys to determine
status. Return warning if key is not present.''')
parser.add_argument('-E', '--key_exists_critical', dest='key_list_critical',
nargs='*',
help='''Same as -e but return critical if key is
not present.''')
parser.add_argument('-q', '--key_equals', dest='key_value_list',
action='extend',
nargs='*',
help='''Checks equality of these keys and values
(key[>alias],value key2,value2) to determine status.
Multiple key values can be delimited with colon
(key,value1:value2). Return warning if equality
check fails''')
parser.add_argument('-Q', '--key_equals_critical', dest='key_value_list_critical',
action='extend',
nargs='*',
help='''Same as -q but return critical if
equality check fails.''')
parser.add_argument('--key_time', dest='key_time_list', nargs='*',
help='''Checks the timestamp of these keys and values
(key[>alias],value key2,value2) to determine status.
Multiple key values can be delimited with colon
(key,value1:value2). Return warning if the key is older
than the value (ex.: 30s,10m,2h,3d,...).
With a leading @ return warning if the key is younger
than the value (ex.: @30s,@10m,@2h,@3d,...).
With a minus you can shift the time into the future.''')
parser.add_argument('--key_time_critical',
dest='key_time_list_critical', nargs='*',
help='''Same as --key_time but return critical if
Timestamp age fails.''')
parser.add_argument('-u', '--key_equals_unknown',
dest='key_value_list_unknown', nargs='*',
help='''Same as -q but return unknown if
equality check fails.''')
parser.add_argument('-y', '--key_not_equals',
dest='key_value_list_not', nargs='*',
help='''Checks equality of these keys and values
(key[>alias],value key2,value2) to determine status.
Multiple key values can be delimited with colon
(key,value1:value2). Return warning if equality
check succeeds''')
parser.add_argument('-Y', '--key_not_equals_critical',
dest='key_value_list_not_critical', nargs='*',
help='''Same as -q but return critical if equality
check succeeds.''')
parser.add_argument('-m', '--key_metric', dest='metric_list',
action='extend',
nargs='*',
help='''Gathers the values of these keys (key[>alias],
UnitOfMeasure,WarnRange,CriticalRange,Min,Max) for
Nagios performance data. More information about Range
format and units of measure for nagios can be found at
nagios-plugins.org/doc/guidelines.html
Additional formats for this parameter are:
(key[>alias]), (key[>alias],UnitOfMeasure),
(key[>alias],UnitOfMeasure,WarnRange,
CriticalRange).''')
return parser.parse_args(args)
def debugPrint(debug_flag, message):
"""
Print debug messages if -d is set.
"""
if not debug_flag:
return
print(message)
def verbosePrint(verbose_flag, when, message):
"""
Print verbose messages if -v is set.
Since -v can be used multiple times, the when parameter sets the required amount before printing
"""
if not verbose_flag:
return
if verbose_flag >= when:
print(message)
def prepare_context(args):
"""
Prepare TLS Context
"""
nagios = NagiosHelper()
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.options |= ssl.OP_NO_SSLv2
context.options |= ssl.OP_NO_SSLv3
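# Explicitly rule out the legacy SSLv2/SSLv3 protocols (modern OpenSSL builds typically disable them already)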
if args.insecure:
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE
else:
context.verify_mode = ssl.CERT_OPTIONAL
context.load_default_certs()
if args.cacert:
try:
context.load_verify_locations(args.cacert)
except ssl.SSLError:
nagios.append_message(UNKNOWN_CODE, 'Error loading SSL CA cert "%s"!' % args.cacert)
if args.cert:
try:
context.load_cert_chain(args.cert, keyfile=args.key)
except ssl.SSLError:
if args.key:
nagios.append_message(UNKNOWN_CODE, 'Error loading SSL cert. Make sure key "%s" belongs to cert "%s"!' % (args.key, args.cert))
else:
nagios.append_message(UNKNOWN_CODE, 'Error loading SSL cert. Make sure "%s" contains the key as well!' % (args.cert))
if nagios.getCode() != OK_CODE:
print(nagios.getMessage())
sys.exit(nagios.getCode())
return context
def make_request(args, url, context):
"""
Performs the actual request to the given URL
"""
req = urllib.request.Request(url, method=args.method)
req.add_header("User-Agent", "check_http_json")
if args.auth:
authbytes = str(args.auth).encode()
base64str = base64.encodebytes(authbytes).decode().replace('\n', '')
req.add_header('Authorization', 'Basic %s' % base64str)
if args.headers:
headers = json.loads(args.headers)
debugPrint(args.debug, "Headers:\n %s" % headers)
for header in headers:
req.add_header(header, headers[header])
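# Only pass timeout/data to urlopen() when they were actually supplied on the CLI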
if args.timeout and args.data:
databytes = str(args.data).encode()
response = urllib.request.urlopen(req, timeout=args.timeout,
data=databytes, context=context)
elif args.timeout:
response = urllib.request.urlopen(req, timeout=args.timeout,
context=context)
elif args.data:
databytes = str(args.data).encode()
response = urllib.request.urlopen(req, data=databytes, context=context)
else:
# pylint: disable=consider-using-with
response = urllib.request.urlopen(req, context=context)
return response.read()
def main(cliargs):
"""
Main entrypoint for CLI
"""
args = parseArgs(cliargs)
nagios = NagiosHelper()
context = None
if args.version:
print('Version: %s - Date: %s' % (__version__, __version_date__))
sys.exit(0)
if args.ssl:
url = "https://%s" % args.host
context = prepare_context(args)
else:
url = "http://%s" % args.host
if args.port:
url += ":%s" % args.port
if args.path:
url += "/%s" % args.path
debugPrint(args.debug, "url: %s" % url)
json_data = ''
# Attempt to reach the endpoint
try:
# Requesting the data from the URL
json_data = make_request(args, url, context)
except HTTPError as e:
# Try to recover from HTTP Error, if there is JSON in the response
if "json" in e.info().get_content_subtype():
json_data = e.read()
else:
exit_code = args.invalid_json_state
nagios.append_message(exit_code, " Could not find JSON in HTTP body. HTTPError[%s], url:%s" % (str(e.code), url))
except URLError as e:
# Some users might prefer another exit code if the URL wasn't reached
exit_code = args.unreachable_state
nagios.append_message(exit_code, " URLError[%s], url:%s" % (str(e.reason), url))
# Since we didn't get any data, we can simply exit
print(nagios.getMessage())
sys.exit(nagios.getCode())
try:
# Loading the JSON data from the request
data = json.loads(json_data)
except ValueError as e:
exit_code = args.invalid_json_state
debugPrint(args.debug, traceback.format_exc())
nagios.append_message(exit_code, " JSON Parser error: %s" % str(e))
print(nagios.getMessage())
sys.exit(nagios.getCode())
verbosePrint(args.verbose, 1, json.dumps(data, indent=2))
try:
# Applying rules to returned JSON data
processor = JsonRuleProcessor(data, args)
nagios.append_message(WARNING_CODE, processor.checkWarning())
nagios.append_message(CRITICAL_CODE, processor.checkCritical())
nagios.append_metrics(processor.checkMetrics())
nagios.append_message(UNKNOWN_CODE, processor.checkUnknown())
except Exception as e: # pylint: disable=broad-exception-caught
debugPrint(args.debug, traceback.format_exc())
nagios.append_message(UNKNOWN_CODE, " Rule Parser error: %s" % str(e))
# Print Nagios specific string and exit appropriately
print(nagios.getMessage())
sys.exit(nagios.getCode())
if __name__ == "__main__":
# Program entry point
main(sys.argv[1:])
#EOF


@@ -0,0 +1,135 @@
object CheckCommand "http_json" {
// Example configuration for Icinga
import "plugin-check-command"
command = [ PluginDir + "/check_http_json.py" ]
arguments = {
"--host" = {
value = "$address$"
description = "Hostname or address of the interface to query"
required = true
}
"--port" = {
value = "$http_json_port$"
description = "TCP port number"
}
"--path" = {
value = "$http_json_path$"
description = "URL path to query (i.e.: /v1/service/xyz)"
}
"--timeout" = {
value = "$http_json_timeout$"
description = "Connection timeout (seconds)"
}
"--basic-auth" = {
value = "$http_json_basic_auth$"
description = "Basic auth string 'username:password'"
}
"--ssl" = {
set_if = "$http_json_ssl$"
description = "use TLS to connect to remote host"
}
"--insecure" = {
set_if = "$http_json_insecure$"
description = "do not check server SSL certificate"
}
"--cacert" = {
value = "$http_json_cacert_file$"
description = "path of cacert file to validate server cert"
}
"--cert" = {
value = "$http_json_cert_file$"
description = "client certificate in PEM format"
}
"--key" = {
value = "$http_json_key_file$"
description = "client certificate key file in PEM format ( if not bundled into the cert )"
}
"--data" = {
value = "$http_json_post_data$"
description = "the http payload to send as a POST"
}
"--headers" = {
value = "$http_json_headers$"
description = "additional http headers in JSON format to send with the request"
}
"--unreachable-state" = {
value = "$http_json_unreachable_state$"
description = "Exit with specified code when the URL is unreachable."
}
"--invalid-json-state" = {
value = "$http_json_invalid_json_state$"
description = "Exit with specified code when no valid JSON is returned."
"--field_separator" = {
value = "$http_json_field_separator$"
description = "JSON Field separator, defaults to '.'; Select element in an array with '(' ')'"
}
"--value_separator" = {
value = "$http_json_value_separator$"
description = "JSON Value separator, defaults to ':'"
}
"--warning" = {
value = "$http_json_warning$"
description = "Warning threshold for these values, WarningRange is in the format [@]start:end"
repeat_key = true
}
"--critical" = {
value = "$http_json_critical$"
description = "Critical threshold for these values, CriticalRange is in the format [@]start:end"
repeat_key = true
}
"--key_exists" = {
value = "$http_json_key_exists$"
description = "Checks existence of these keys to determine status. Return warning if key is not present."
repeat_key = true
}
"--key_exists_critical" = {
value = "$http_json_key_exists_critical$"
description = "Checks existence of these keys to determine status. Return critical if key is not present."
repeat_key = true
}
"--key_equals" = {
value = "$http_json_key_equals$"
description = "Checks equality of these keys and values. Return warning if equality check fails"
repeat_key = true
}
"--key_equals_critical" = {
value = "$http_json_key_equals_critical$"
description = "Checks equality of these keys and values. Return critical if equality check fails"
repeat_key = true
}
"--key_equals_unknown" = {
value = "$http_json_key_equals_unknown$"
description = "Checks equality of these keys and values. Return unknown if equality check fails"
repeat_key = true
}
"--key_not_equals" = {
value = "$http_json_key_not_equals$"
description = "Checks equality of these keys and values (key[>alias],value key2,value2) to determine status. Multiple key values can be delimited with colon (key,value1:value2). Return warning if equality check succeeds."
repeat_key = true
}
"--key_not_equals_critical" = {
value = "$http_json_key_not_equals_critical$"
description = "Checks equality of these keys and values (key[>alias],value key2,value2) to determine status. Multiple key values can be delimited with colon (key,value1:value2). Return critical if equality check succeeds."
repeat_key = true
}
"--key_metric" = {
value = "$http_json_key_metric$"
description = "Gathers the values of these keys"
repeat_key = true
}
"--key_time" = {
value = "$http_json_key_time$"
description = " Checks a Timestamp of these keys and values (key[>alias],value key2,value2) to determine status."
repeat_key = true
}
"--key_time_critical" = {
value = "$http_json_key_time_critical$"
description = "Same as --key_time but return critical if Timestamp age fails."
repeat_key = true
}
}
}

115
docs/DOCKER.md Normal file

@@ -0,0 +1,115 @@
### Docker Info Example Plugin
#### Description
Let's say we want to use `check_http_json.py` to read from Docker's `/info` HTTP API endpoint with the following parameters:
##### Connection information
* Host = 127.0.0.1:4243
* Path = /info
##### Rules for "aliveness"
* Verify that the key `Containers` exists in the outputted JSON
* Verify that the key `IPv4Forwarding` has a value of `1`
* Verify that the key `Debug` has a value less than or equal to `2`
* Verify that the key `Images` has a value greater than or equal to `1`
* If any of these criteria are not met, report a WARNING to Nagios
##### Gather Metrics
* Report value of the key `Containers` with a MinValue of 0 and a MaxValue of 1000 as performance data
* Report value of the key `Images` as performance data
* Report value of the key `NEventsListener` as performance data
* Report value of the key `NFd` as performance data
* Report value of the key `NGoroutines` as performance data
* Report value of the key `SwapLimit` as performance data
#### Service Definition
`localhost.cfg`
```
define service {
use local-service
host_name localhost
service_description Docker info status checker
check_command check_docker
}
```
#### Command Definition with Arguments
`commands.cfg`
```
define command{
command_name check_docker
command_line /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H 127.0.0.1:4243 -p info -e Containers -q IPv4Forwarding,1 -w Debug,2:2 -c Images,1:1 -m Containers,0:250,0:500,0,1000 Images NEventsListener NFd NGoroutines SwapLimit
}
```
#### Sample Output
```
OK: Status OK.|'Containers'=1;0;1000 'Images'=11;0;0 'NEventsListener'=3;0;0 'NFd'=10;0;0 'NGoroutines'=14;0;0 'SwapLimit'=1;0;0
```
### Docker Container Monitor Example Plugin
`check_http_json.py` is generic enough to read and evaluate rules on any HTTP endpoint that returns JSON. In this example we'll get the status of a specific container using its ID, which can be found with the list-containers endpoint (`curl http://127.0.0.1:4243/containers/json?all=1`).
##### Connection information
* Host = 127.0.0.1:4243
* Path = /containers/2356e8ccb3de8308ccb16cf8f5d157bc85ded5c3d8327b0dfb11818222b6f615/json
##### Rules for "aliveness"
* Verify that the key `ID` exists and is equal to the value `2356e8ccb3de8308ccb16cf8f5d157bc85ded5c3d8327b0dfb11818222b6f615`
* Verify that the key `State.Running` has a value of `True`
#### Service Definition
`localhost.cfg`
```
define service {
use local-service
host_name localhost
service_description Docker container liveness check
check_command check_my_container
}
```
#### Command Definition with Arguments
`commands.cfg`
```
define command{
command_name check_my_container
command_line /usr/bin/python /usr/local/nagios/libexec/plugins/check_http_json.py -H 127.0.0.1:4243 -p /containers/2356e8ccb3de8308ccb16cf8f5d157bc85ded5c3d8327b0dfb11818222b6f615/json -q ID,2356e8ccb3de8308ccb16cf8f5d157bc85ded5c3d8327b0dfb11818222b6f615 State.Running,True
}
```
#### Sample Output
```
WARNING: Status check failed, reason: Value True for key State.Running did not match.
```
The plugin threw a warning because the Container ID I used on my system has the following State object:
```
u'State': {...
u'Running': False,
...
```
If I change the `-q` parameter to `State.Running,False`, the output becomes:
```
OK: Status OK.
```

227
docs/RIAK.md Normal file

@@ -0,0 +1,227 @@
# Riak Stats Example
## Description
For this example we're going to use `check_http_json.py` as a pure CLI tool to read Riak's `/stats` endpoint
## Connection information
* Host = 127.0.0.1:8098
* Path = /stats
## JSON Stats Data
* Full Riak HTTP Stats information can be found here: [http://docs.basho.com/riak/latest/dev/references/http/status/](http://docs.basho.com/riak/latest/dev/references/http/status/)
* Information related to specific interesting stats can be found here: [http://docs.basho.com/riak/latest/ops/running/stats-and-monitoring/](http://docs.basho.com/riak/latest/ops/running/stats-and-monitoring/)
## Connectivity Check
* `ring_members`: We can use an existence check to monitor the number of ring members
* `connected_nodes`: Similarly we can check the number of nodes that are in communication with this node, but this list will be empty in a 1 node cluster
#### Sample Command
For a single node dev "cluster", you might have a `ring_members` value like this:
```
"ring_members": [
"riak@127.0.0.1"
],
```
To validate that we have a single node, we can use this check:
```
$ ./check_http_json.py -H localhost -P 8098 -p stats -E "ring_members(0)"
OK: Status OK.
```
If we were expecting at least 2 nodes in the cluster, we would use this check:
```
$ ./check_http_json.py -H localhost -P 8098 -p stats -E "ring_members(1)"
CRITICAL: Status CRITICAL. Key ring_members(1) did not exist.
```
Obviously this fails because we only had a single `ring_member`. If we prefer to only get a warning instead of a critical for this check, we just use the correct flag:
```
$ ./check_http_json.py -H localhost -P 8098 -p stats -e "ring_members(1)"
WARNING: Status WARNING. Key ring_members(1) did not exist.
```
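The same pattern applies to `connected_nodes`: on a single-node dev cluster that list is empty, so an existence check on its first element fails as well. A hypothetical run (assuming the same single-node setup as above) would look something like:
```
$ ./check_http_json.py -H localhost -P 8098 -p stats -e "connected_nodes(0)"
WARNING: Status WARNING. Key connected_nodes(0) did not exist.
```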
## Gather Metrics
The thresholds for acceptable values for these metrics will vary from system to system; these are the stats we'll be checking:
### Throughput Metrics:
* `node_gets`
* `node_puts`
* `vnode_counter_update`
* `vnode_set_update`
* `vnode_map_update`
* `search_query_throughput_one`
* `search_index_throughtput_one`
* `consistent_gets`
* `consistent_puts`
* `vnode_index_reads`
#### Sample Command
```
./check_http_json.py -H localhost -P 8098 -p stats -m \
"node_gets" \
"node_puts" \
"vnode_counter_update" \
"vnode_set_update" \
"vnode_map_update" \
"search_query_throughput_one" \
"search_index_throughtput_one" \
"consistent_gets" \
"consistent_puts" \
"vnode_index_reads"
```
#### Sample Output
```
OK: Status OK.|'node_gets'=0 'node_puts'=0 'vnode_counter_update'=0 'vnode_set_update'=0 'vnode_map_update'=0 'search_query_throughput_one'=0 'consistent_gets'=0 'consistent_puts'=0 'vnode_index_reads'=0
```
### Latency Metrics:
* `node_get_fsm_time_mean,_median,_95,_99,_100`
* `node_put_fsm_time_mean,_median,_95,_99,_100`
* `object_counter_merge_time_mean,_median,_95,_99,_100`
* `object_set_merge_time_mean,_median,_95,_99,_100`
* `object_map_merge_time_mean,_median,_95,_99,_100`
* `search_query_latency_median,_min,_95,_99,_999`
* `search_index_latency_median,_min,_95,_99,_999`
* `consistent_get_time_mean,_median,_95,_99,_100`
* `consistent_put_time_mean,_median,_95,_99,_100`
#### Sample Command
```
./check_http_json.py -H localhost -P 8098 -p stats -m \
"node_get_fsm_time_mean,,0:100,0:1000" \
"node_get_fsm_time_median,,0:100,0:1000" \
"node_get_fsm_time_95,,0:100,0:1000" \
"node_get_fsm_time_99,,0:100,0:1000" \
"node_get_fsm_time_100,,0:100,0:1000" \
"node_put_fsm_time_mean,,0:100,0:1000" \
"node_put_fsm_time_median,,0:100,0:1000" \
"node_put_fsm_time_95,,0:100,0:1000" \
"node_put_fsm_time_99,,0:100,0:1000" \
"node_put_fsm_time_100,,0:100,0:1000" \
"object_counter_merge_time_mean,,0:100,0:1000" \
"object_counter_merge_time_median,,0:100,0:1000" \
"object_counter_merge_time_95,,0:100,0:1000" \
"object_counter_merge_time_99,,0:100,0:1000" \
"object_counter_merge_time_100,,0:100,0:1000" \
"object_set_merge_time_mean,,0:100,0:1000" \
"object_set_merge_time_median,,0:100,0:1000" \
"object_set_merge_time_95,,0:100,0:1000" \
"object_set_merge_time_99,,0:100,0:1000" \
"object_set_merge_time_100,,0:100,0:1000" \
"object_map_merge_time_mean,,0:100,0:1000" \
"object_map_merge_time_median,,0:100,0:1000" \
"object_map_merge_time_95,,0:100,0:1000" \
"object_map_merge_time_99,,0:100,0:1000" \
"object_map_merge_time_100,,0:100,0:1000" \
"consistent_get_time_mean,,0:100,0:1000" \
"consistent_get_time_median,,0:100,0:1000" \
"consistent_get_time_95,,0:100,0:1000" \
"consistent_get_time_99,,0:100,0:1000" \
"consistent_get_time_100,,0:100,0:1000" \
"consistent_put_time_mean,,0:100,0:1000" \
"consistent_put_time_median,,0:100,0:1000" \
"consistent_put_time_95,,0:100,0:1000" \
"consistent_put_time_99,,0:100,0:1000" \
"consistent_put_time_100,,0:100,0:1000" \
"search_query_latency_median,,0:100,0:1000" \
"search_query_latency_min,,0:100,0:1000" \
"search_query_latency_95,,0:100,0:1000" \
"search_query_latency_99,,0:100,0:1000" \
"search_query_latency_999,,0:100,0:1000" \
"search_index_latency_median,,0:100,0:1000" \
"search_index_latency_min,,0:100,0:1000" \
"search_index_latency_95,,0:100,0:1000" \
"search_index_latency_99,,0:100,0:1000" \
"search_index_latency_999,,0:100,0:1000"
```
#### Sample Output
```
OK: Status OK.|'node_get_fsm_time_mean'=0;0:100;0:1000 'node_get_fsm_time_median'=0;0:100;0:1000 'node_get_fsm_time_95'=0;0:100;0:1000 'node_get_fsm_time_99'=0;0:100;0:1000 'node_get_fsm_time_100'=0;0:100;0:1000 'node_put_fsm_time_mean'=0;0:100;0:1000 'node_put_fsm_time_median'=0;0:100;0:1000 'node_put_fsm_time_95'=0;0:100;0:1000 'node_put_fsm_time_99'=0;0:100;0:1000 'node_put_fsm_time_100'=0;0:100;0:1000 'object_counter_merge_time_mean'=0;0:100;0:1000 'object_counter_merge_time_median'=0;0:100;0:1000 'object_counter_merge_time_95'=0;0:100;0:1000 'object_counter_merge_time_99'=0;0:100;0:1000 'object_counter_merge_time_100'=0;0:100;0:1000 'object_set_merge_time_mean'=0;0:100;0:1000 'object_set_merge_time_median'=0;0:100;0:1000 'object_set_merge_time_95'=0;0:100;0:1000 'object_set_merge_time_99'=0;0:100;0:1000 'object_set_merge_time_100'=0;0:100;0:1000 'object_map_merge_time_mean'=0;0:100;0:1000 'object_map_merge_time_median'=0;0:100;0:1000 'object_map_merge_time_95'=0;0:100;0:1000 'object_map_merge_time_99'=0;0:100;0:1000 'object_map_merge_time_100'=0;0:100;0:1000 'consistent_get_time_mean'=0;0:100;0:1000 'consistent_get_time_median'=0;0:100;0:1000 'consistent_get_time_95'=0;0:100;0:1000 'consistent_get_time_99'=0;0:100;0:1000 'consistent_get_time_100'=0;0:100;0:1000 'consistent_put_time_mean'=0;0:100;0:1000 'consistent_put_time_median'=0;0:100;0:1000 'consistent_put_time_95'=0;0:100;0:1000 'consistent_put_time_99'=0;0:100;0:1000 'consistent_put_time_100'=0;0:100;0:1000 'search_query_latency_median'=0;0:100;0:1000 'search_query_latency_min'=0;0:100;0:1000 'search_query_latency_95'=0;0:100;0:1000 'search_query_latency_99'=0;0:100;0:1000 'search_query_latency_999'=0;0:100;0:1000 'search_index_latency_median'=0;0:100;0:1000 'search_index_latency_min'=0;0:100;0:1000 'search_index_latency_95'=0;0:100;0:1000 'search_index_latency_99'=0;0:100;0:1000 'search_index_latency_999'=0;0:100;0:1000
```
### Erlang Resource Usage Metrics:
* `sys_process_count`
* `memory_processes`
* `memory_processes_used`
#### Sample Command
```
./check_http_json.py -H localhost -P 8098 -p stats -m \
"sys_process_count,,0:5000,0:10000" \
"memory_processes,,0:50000000,0:100000000" \
"memory_processes_used,,0:50000000,0:100000000"
```
#### Sample Output
```
OK: Status OK.|'sys_process_count'=1637;0:5000;0:10000 'memory_processes'=46481112;0:50000000;0:100000000 'memory_processes_used'=46476880;0:50000000;0:100000000
```
### General Riak Load / Health Metrics:
* `node_get_fsm_siblings_mean,_median,_95,_99,_100`
* `node_get_fsm_objsize_mean,_median,_95,_99,_100`
* `riak_search_vnodeq_mean,_median,_95,_99,_100`
* `search_index_fail_one`
* `pbc_active`
* `pbc_connects`
* `read_repairs`
* `list_fsm_active`
* `node_get_fsm_rejected`
* `node_put_fsm_rejected`
#### Sample Command
```
./check_http_json.py -H localhost -P 8098 -p stats -m \
"node_get_fsm_siblings_mean,,0:100,0:1000" \
"node_get_fsm_siblings_median,,0:100,0:1000" \
"node_get_fsm_siblings_95,,0:100,0:1000" \
"node_get_fsm_siblings_99,,0:100,0:1000" \
"node_get_fsm_siblings_100,,0:100,0:1000" \
"node_get_fsm_objsize_mean,,0:100,0:1000" \
"node_get_fsm_objsize_median,,0:100,0:1000" \
"node_get_fsm_objsize_95,,0:100,0:1000" \
"node_get_fsm_objsize_99,,0:100,0:1000" \
"node_get_fsm_objsize_100,,0:100,0:1000" \
"riak_search_vnodeq_mean,,0:100,0:1000" \
"riak_search_vnodeq_median,,0:100,0:1000" \
"riak_search_vnodeq_95,,0:100,0:1000" \
"riak_search_vnodeq_99,,0:100,0:1000" \
"riak_search_vnodeq_100,,0:100,0:1000" \
"search_index_fail_one,,0:100,0:1000" \
"pbc_active,,0:100,0:1000" \
"pbc_connects,,0:100,0:1000" \
"read_repairs,,0:100,0:1000" \
"list_fsm_active,,0:100,0:1000" \
"node_get_fsm_rejected,,0:100,0:1000" \
"node_put_fsm_rejected,,0:100,0:1000"
```
#### Sample Output
```
OK: Status OK.|'node_get_fsm_siblings_mean'=0;0:100;0:1000 'node_get_fsm_siblings_median'=0;0:100;0:1000 'node_get_fsm_siblings_95'=0;0:100;0:1000 'node_get_fsm_siblings_99'=0;0:100;0:1000 'node_get_fsm_siblings_100'=0;0:100;0:1000 'node_get_fsm_objsize_mean'=0;0:100;0:1000 'node_get_fsm_objsize_median'=0;0:100;0:1000 'node_get_fsm_objsize_95'=0;0:100;0:1000 'node_get_fsm_objsize_99'=0;0:100;0:1000 'node_get_fsm_objsize_100'=0;0:100;0:1000 'search_index_fail_one'=0;0:100;0:1000 'pbc_active'=0;0:100;0:1000 'pbc_connects'=0;0:100;0:1000 'read_repairs'=0;0:100;0:1000 'list_fsm_active'=0;0:100;0:1000 'node_get_fsm_rejected'=0;0:100;0:1000 'node_put_fsm_rejected'=0;0:100;0:1000
```

2
requirements-dev.txt Normal file

@@ -0,0 +1,2 @@
coverage==7.8.0
pylint==3.3.6

0
test/__init__.py Normal file

2
test/requirements.txt Normal file

@@ -0,0 +1,2 @@
coverage==5.0.3
pylint==2.4.4

34
test/test_args.py Normal file

@@ -0,0 +1,34 @@
#!/usr/bin/env python3
import unittest
import sys
sys.path.append('..')
from check_http_json import *
class ArgsTest(unittest.TestCase):
"""
Tests for argparse
"""
def test_parser_defaults(self):
parser = parseArgs(['-H', 'foobar'])
self.assertFalse(parser.debug)
self.assertFalse(parser.ssl)
self.assertFalse(parser.insecure)
def test_parser_with_debug(self):
parser = parseArgs(['-H', 'foobar', '-d'])
self.assertTrue(parser.debug)
def test_parser_with_port(self):
parser = parseArgs(['-H', 'foobar', '-P', '8888'])
self.assertEqual(parser.port, '8888')
def test_parser_with_separator(self):
parser = parseArgs(['-H', 'foobar', '-f', '_', '-F', '_'])
self.assertEqual(parser.separator, '_')
self.assertEqual(parser.value_separator, '_')


@@ -0,0 +1,450 @@
#!/usr/bin/env python3
import json
import unittest
from unittest.mock import patch
import sys
sys.path.append('..')
from check_http_json import *
OK_CODE = 0
WARNING_CODE = 1
CRITICAL_CODE = 2
UNKNOWN_CODE = 3
class RulesHelper:
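"""
Stand-in for the argparse Namespace consumed by JsonRuleProcessor,
so tests can build rule sets fluently, e.g. RulesHelper().dash_q(['key,value']).
"""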
separator = '.'
value_separator = ':'
debug = False
key_threshold_warning = None
key_value_list = None
key_value_list_not = None
key_list = None
key_threshold_critical = None
key_value_list_critical = None
key_value_list_not_critical = None
key_time_list = None
key_time_list_critical = None
key_value_list_unknown = None
key_list_critical = None
metric_list = None
def dash_m(self, data):
self.metric_list = data
return self
def dash_e(self, data):
self.key_list = data
return self
def dash_E(self, data):
self.key_list_critical = data
return self
def dash_q(self, data):
self.key_value_list = data
return self
def dash_Q(self, data):
self.key_value_list_critical = data
return self
def dash_y(self, data):
self.key_value_list_not = data
return self
def dash_Y(self, data):
self.key_value_list_not_critical = data
return self
def dash_U(self, data):
self.key_value_list_unknown = data
return self
def dash_w(self, data):
self.key_threshold_warning = data
return self
def dash_c(self, data):
self.key_threshold_critical = data
return self
def dash_dash_key_time(self, data):
self.key_time_list = data
return self
def dash_dash_key_time_critical(self, data):
self.key_time_list_critical = data
return self
class UtilTest(unittest.TestCase):
"""
Tests for the util functions
"""
rules = RulesHelper()
def check_data(self, args, jsondata, code):
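"""
Run the rule processor against jsondata and assert the resulting Nagios exit code.
"""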
data = json.loads(jsondata)
nagios = NagiosHelper()
processor = JsonRuleProcessor(data, args)
nagios.append_message(WARNING_CODE, processor.checkWarning())
nagios.append_message(CRITICAL_CODE, processor.checkCritical())
nagios.append_metrics(processor.checkMetrics())
nagios.append_message(UNKNOWN_CODE, processor.checkUnknown())
self.assertEqual(code, nagios.getCode())
def test_metrics(self):
self.check_data(RulesHelper().dash_m(['metric,,1:4,1:5']),
'{"metric": 5}', WARNING_CODE)
self.check_data(RulesHelper().dash_m(['metric,,1:5,1:4']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_m(['metric,,1:5,1:5,6,10']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_m(['metric,,1:5,1:5,1,4']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_m(['metric,s,@1:4,@6:10,1,10']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_m(['(*).value,s,1:5,1:5']),
'[{"value": 5},{"value": 100}]', CRITICAL_CODE)
self.check_data(RulesHelper().dash_m(['metric>foobar,,1:4,1:5']),
'{"metric": 5}', WARNING_CODE)
def test_unknown(self):
self.check_data(RulesHelper().dash_U(['metric,0']),
'{"metric": 3}', UNKNOWN_CODE)
def test_array(self):
self.check_data(RulesHelper().dash_q(['foo(0),bar']),
'{"foo": ["bar"]}', OK_CODE)
self.check_data(RulesHelper().dash_q(['foo(0),foo']),
'{"foo": ["bar"]}', WARNING_CODE)
self.check_data(RulesHelper().dash_Q(['foo(1),bar']),
'{"foo": ["bar"]}', CRITICAL_CODE)
def test_exists(self):
self.check_data(RulesHelper().dash_e(['nothere']),
'{"metric": 5}', WARNING_CODE)
self.check_data(RulesHelper().dash_E(['nothere']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_e(['metric']),
'{"metric": 5}', OK_CODE)
def test_equality(self):
self.check_data(RulesHelper().dash_q(['metric,6']),
'{"metric": 5}', WARNING_CODE)
self.check_data(RulesHelper().dash_Q(['metric,6']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_q(['metric,5']),
'{"metric": 5}', OK_CODE)
def test_equality_colon(self):
"""
See https://github.com/drewkerrigan/nagios-http-json/issues/43
"""
rules = RulesHelper()
rules.value_separator = '_'
# This should not fail
self.check_data(rules.dash_q(['metric,foo:bar']),
'{"metric": "foo:bar"}', OK_CODE)
def test_non_equality(self):
self.check_data(RulesHelper().dash_y(['metric,6']),
'{"metric": 6}', WARNING_CODE)
self.check_data(RulesHelper().dash_Y(['metric,6']),
'{"metric": 6}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_y(['metric,5']),
'{"metric": 6}', OK_CODE)
def test_warning_thresholds(self):
self.check_data(RulesHelper().dash_w(['metric,5']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_w(['metric,5:']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_w(['metric,~:5']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_w(['metric,1:5']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_w(['metric,@5']),
'{"metric": 6}', OK_CODE)
self.check_data(RulesHelper().dash_w(['metric,@5:']),
'{"metric": 4}', OK_CODE)
self.check_data(RulesHelper().dash_w(['metric,@~:5']),
'{"metric": 6}', OK_CODE)
self.check_data(RulesHelper().dash_w(['metric,@1:5']),
'{"metric": 6}', OK_CODE)
self.check_data(RulesHelper().dash_w(['metric,5']),
'{"metric": 6}', WARNING_CODE)
self.check_data(RulesHelper().dash_w(['metric,5:']),
'{"metric": 4}', WARNING_CODE)
self.check_data(RulesHelper().dash_w(['metric,~:5']),
'{"metric": 6}', WARNING_CODE)
self.check_data(RulesHelper().dash_w(['metric,1:5']),
'{"metric": 6}', WARNING_CODE)
self.check_data(RulesHelper().dash_w(['metric,@5']),
'{"metric": 5}', WARNING_CODE)
self.check_data(RulesHelper().dash_w(['metric,@5:']),
'{"metric": 5}', WARNING_CODE)
self.check_data(RulesHelper().dash_w(['metric,@~:5']),
'{"metric": 5}', WARNING_CODE)
self.check_data(RulesHelper().dash_w(['metric,@1:5']),
'{"metric": 5}', WARNING_CODE)
self.check_data(RulesHelper().dash_w(['(*).value,@1:5']),
'[{"value": 5},{"value": 1000}]', WARNING_CODE)
def test_critical_thresholds(self):
self.check_data(RulesHelper().dash_c(['metric,5']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_c(['metric,5:']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_c(['metric,~:5']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_c(['metric,1:5']),
'{"metric": 5}', OK_CODE)
self.check_data(RulesHelper().dash_c(['metric,@5']),
'{"metric": 6}', OK_CODE)
self.check_data(RulesHelper().dash_c(['metric,@5:']),
'{"metric": 4}', OK_CODE)
self.check_data(RulesHelper().dash_c(['metric,@~:5']),
'{"metric": 6}', OK_CODE)
self.check_data(RulesHelper().dash_c(['metric,@1:5']),
'{"metric": 6}', OK_CODE)
self.check_data(RulesHelper().dash_c(['metric,5']),
'{"metric": 6}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_c(['metric,5:']),
'{"metric": 4}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_c(['metric,~:5']),
'{"metric": 6}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_c(['metric,1:5']),
'{"metric": 6}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_c(['metric,@5']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_c(['metric,@5:']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_c(['metric,@~:5']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_c(['metric,@1:5']),
'{"metric": 5}', CRITICAL_CODE)
self.check_data(RulesHelper().dash_c(['(*).value,@1:5']),
'[{"value": 5},{"value": 1000}]', CRITICAL_CODE)
def test_separator(self):
rules = RulesHelper()
rules.separator = '_'
self.check_data(
rules.dash_q(
['(0)_gauges_jvm.buffers.direct.capacity(1)_value,1234']),
'''[{ "gauges": { "jvm.buffers.direct.capacity": [
{"value": 215415},{"value": 1234}]}}]''',
OK_CODE)
self.check_data(
rules.dash_q(
['(*)_gauges_jvm.buffers.direct.capacity(1)_value,1234']),
'''[{ "gauges": { "jvm.buffers.direct.capacity": [
{"value": 215415},{"value": 1234}]}},
{ "gauges": { "jvm.buffers.direct.capacity": [
{"value": 215415},{"value": 1235}]}}]''',
WARNING_CODE)
def test_array_with_missing_element(self):
"""
See https://github.com/drewkerrigan/nagios-http-json/issues/34
"""
rules = RulesHelper()
# This should simply work
data = '[{"Node": "there"}]'
self.check_data(rules.dash_q(['(0).Node,there']), data, OK_CODE)
# This should warn us
data = '[{"Node": "othervalue"}]'
self.check_data(rules.dash_q(['(0).Node,there']), data, WARNING_CODE)
# This should not throw an IndexError
data = '[{"Node": "foobar"}]'
self.check_data(rules.dash_q(['(0).Node,foobar', '(1).Node,missing']), data, WARNING_CODE)
self.check_data(rules.dash_q(['(0).Node,foobar', '(1).Node,missing', '(2).Node,alsomissing']), data, WARNING_CODE)
# This should not throw a KeyError
data = '{}'
self.check_data(rules.dash_q(['(0).Node,foobar', '(1).Node,missing']), data, CRITICAL_CODE)
def test_subelem(self):
rules = RulesHelper()
data = '{"foo": {"foo": {"foo": "bar"}}}'
self.check_data(rules.dash_E(['foo.foo.foo.foo.foo']), data, CRITICAL_CODE)
def test_subarrayelem_missing_elem(self):
rules = RulesHelper()
data = '[{"capacity": {"value": 1000}},{"capacity": {"value": 2200}}]'
self.check_data(rules.dash_E(['(*).capacity.value']), data, OK_CODE)
self.check_data(rules.dash_E(['(*).capacity.value.too_deep']), data, CRITICAL_CODE)
# Should not throw keyerror
self.check_data(rules.dash_E(['foo']), data, CRITICAL_CODE)
def test_empty_key_value_array(self):
"""
https://github.com/drewkerrigan/nagios-http-json/issues/61
"""
rules = RulesHelper()
# This should simply work
data = '[{"update_status": "finished"},{"update_status": "finished"}]'
self.check_data(rules.dash_q(['(*).update_status,finished']), data, OK_CODE)
# This should warn us
data = '[{"update_status": "finished"},{"update_status": "failure"}]'
self.check_data(rules.dash_q(['(*).update_status,finished']), data, WARNING_CODE)
# This should throw an error
data = '[]'
self.check_data(rules.dash_q(['(*).update_status,warn_me']), data, CRITICAL_CODE)
def test_key_time(self):
if sys.version_info[1] >= 11:
# Test current timestamp.
now = datetime.now(timezone.utc)
data = "{\"timestamp\": \"%s\",\"timestamp2\": \"%s\"}" % (now, now)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30s', 'timestamp2,30s']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
# Test 31 minute in the past.
data = "{\"timestamp\": \"%s\"}" % (now - timedelta(minutes=31))
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
# Test two hours and one minute in the past.
data = "{\"timestamp\": \"%s\"}" % (now - timedelta(hours=2) - timedelta(minutes=1))
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
# Test one day and one minute in the past.
data = "{\"timestamp\": \"%s\"}" % (now - timedelta(days=1) - timedelta(minutes=1))
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
# Test two hours and one minute in the future.
data = "{\"timestamp\": \"%s\"}" % (now + timedelta(hours=2) + timedelta(minutes=1))
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)
else:
data = "{\"timestamp\": \"2020-01-01\"}"
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,2d']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-30m']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-1h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,-3h']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,-2d']), data, OK_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-30m']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-1h']), data, CRITICAL_CODE)
self.check_data(RulesHelper().dash_dash_key_time(['timestamp,@-3h']), data, WARNING_CODE)
self.check_data(RulesHelper().dash_dash_key_time_critical(['timestamp,@-2d']), data, CRITICAL_CODE)

48
test/test_cli.py Normal file

@@ -0,0 +1,48 @@
#!/usr/bin/env python3
import unittest
import unittest.mock as mock
import sys
import os
sys.path.append('..')
from check_http_json import debugPrint
from check_http_json import verbosePrint
class CLITest(unittest.TestCase):
"""
Tests for CLI
"""
def setUp(self):
"""
Defining the exitcodes
"""
self.exit_0 = 0 << 8
self.exit_1 = 1 << 8
self.exit_2 = 2 << 8
self.exit_3 = 3 << 8
def test_debugprint(self):
with mock.patch('builtins.print') as mock_print:
debugPrint(True, 'debug')
mock_print.assert_called_once_with('debug')
def test_verbose(self):
with mock.patch('builtins.print') as mock_print:
verbosePrint(0, 3, 'verbose')
mock_print.assert_not_called()
verbosePrint(3, 3, 'verbose')
mock_print.assert_called_once_with('verbose')
def test_cli_without_params(self):
command = '/usr/bin/env python3 check_http_json.py > /dev/null 2>&1'
status = os.system(command)
self.assertEqual(status, self.exit_2)

131
test/test_main.py Normal file

@@ -0,0 +1,131 @@
#!/usr/bin/env python3
import unittest
import unittest.mock as mock
import sys
import os
sys.path.append('..')
from check_http_json import main
class MockResponse():
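"""
Minimal stand-in for the response object returned by urllib.request.urlopen(),
exposing only the read() method the plugin uses.
"""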
def __init__(self, status_code=200, content='{"foo": "bar"}'):
self.status_code = status_code
self.content = content
def read(self):
return self.content
class MainTest(unittest.TestCase):
"""
Tests for Main
"""
@mock.patch('builtins.print')
def test_main_version(self, mock_print):
args = ['--version']
with self.assertRaises(SystemExit) as test:
main(args)
mock_print.assert_called_once()
self.assertEqual(test.exception.code, 0)
@mock.patch('builtins.print')
@mock.patch('urllib.request.urlopen')
def test_main_with_ssl(self, mock_request, mock_print):
args = '-H localhost --ssl'.split(' ')
mock_request.return_value = MockResponse()
with self.assertRaises(SystemExit) as test:
main(args)
self.assertEqual(test.exception.code, 0)
@mock.patch('builtins.print')
@mock.patch('urllib.request.urlopen')
def test_main_with_parse_error(self, mock_request, mock_print):
args = '-H localhost'.split(' ')
mock_request.return_value = MockResponse(content='not JSON')
with self.assertRaises(SystemExit) as test:
main(args)
self.assertTrue('Parser error' in str(mock_print.call_args))
self.assertEqual(test.exception.code, 3)
@mock.patch('builtins.print')
def test_main_with_url_error(self, mock_print):
args = '-H localhost'.split(' ')
with self.assertRaises(SystemExit) as test:
main(args)
self.assertTrue('URLError' in str(mock_print.call_args))
self.assertEqual(test.exception.code, 3)
@mock.patch('builtins.print')
@mock.patch('urllib.request.urlopen')
def test_main_with_http_error_no_json(self, mock_request, mock_print):
args = '-H localhost'.split(' ')
mock_request.return_value = MockResponse(content='not JSON', status_code=503)
with self.assertRaises(SystemExit) as test:
main(args)
self.assertTrue('Parser error' in str(mock_print.call_args))
self.assertEqual(test.exception.code, 3)
@mock.patch('builtins.print')
@mock.patch('urllib.request.urlopen')
def test_main_with_http_error_valid_json(self, mock_request, mock_print):
args = '-H localhost'.split(' ')
mock_request.return_value = MockResponse(status_code=503)
with self.assertRaises(SystemExit) as test:
main(args)
self.assertEqual(test.exception.code, 0)
@mock.patch('builtins.print')
def test_main_with_tls(self, mock_print):
args = ['-H', 'localhost',
'--ssl',
'--cacert',
'test/tls/ca-root.pem',
'--cert',
'test/tls/cert.pem',
'--key',
'test/tls/key.pem']
with self.assertRaises(SystemExit) as test:
main(args)
self.assertTrue('https://localhost' in str(mock_print.call_args))
self.assertEqual(test.exception.code, 3)
@mock.patch('builtins.print')
def test_main_with_tls_wrong_ca(self, mock_print):
args = ['-H', 'localhost',
'--ssl',
'--cacert',
'test/tls/key.pem',
'--cert',
'test/tls/cert.pem',
'--key',
'test/tls/key.pem']
with self.assertRaises(SystemExit) as test:
main(args)
self.assertTrue('Error loading SSL CA' in str(mock_print.call_args))
self.assertEqual(test.exception.code, 3)

51
test/test_nagioshelper.py Normal file

@@ -0,0 +1,51 @@
#!/usr/bin/env python3
import json
import unittest
from unittest.mock import patch
import sys
sys.path.append('..')
from check_http_json import *
class NagiosHelperTest(unittest.TestCase):
"""
Tests for the NagiosHelper
"""
def test_getcode_default(self):
helper = NagiosHelper()
self.assertEqual(0, helper.getCode())
def test_getcode_warning(self):
helper = NagiosHelper()
helper.warning_message = 'foobar'
self.assertEqual(1, helper.getCode())
def test_getcode_critical(self):
helper = NagiosHelper()
helper.critical_message = 'foobar'
self.assertEqual(2, helper.getCode())
def test_getcode_unknown(self):
helper = NagiosHelper()
helper.unknown_message = 'foobar'
self.assertEqual(3, helper.getCode())
def test_getmessage_default(self):
helper = NagiosHelper()
self.assertEqual('OK: Status OK.', helper.getMessage())
def test_getmessage_perfomance_data(self):
helper = NagiosHelper()
helper.performance_data = 'foobar'
self.assertEqual('OK: foobar Status OK. |foobar', helper.getMessage())

65
test/testdata/README.md vendored Normal file

@@ -0,0 +1,65 @@
# Example Data for Testing
Example calls:
```bash
python check_http_json.py -H localhost:8080 -p data0.json -q "age,20"
UNKNOWN: Status UNKNOWN. Could not find JSON in HTTP body.
```
```bash
python check_http_json.py -H localhost:8080 -p data1.json -e date
WARNING: Status WARNING. Key date did not exist.
python check_http_json.py -H localhost:8080 -p data1.json -E age
OK: Status OK.
python check_http_json.py -H localhost:8080 -p data1.json -w "age,30"
OK: Status OK.
python check_http_json.py -H localhost:8080 -p data1.json -w "age,20"
WARNING: Status WARNING. Value (30) for key age was outside the range 0:20.
python check_http_json.py -H localhost:8080 -p data1.json -q "age,20"
WARNING: Status WARNING. Key age mismatch. 20 != 30
```
```bash
python check_http_json.py -H localhost:8080 -p data2.json -q "(1).id,123"
WARNING: Status WARNING. Key (1).id mismatch. 123 != 2
python check_http_json.py -H localhost:8080 -p data2.json -Y "(1).id,2"
CRITICAL: Status CRITICAL. Key (1).id match found. 2 == 2
python check_http_json.py -H localhost:8080 -p data2.json -E "(1).author"
OK: Status OK.
python check_http_json.py -H localhost:8080 -p data2.json -E "(1).pages"
CRITICAL: Status CRITICAL. Key (1).pages did not exist.
```
```bash
python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Developer"
OK: Status OK.
python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Dev"
WARNING: Status WARNING. Key company.employees.(0).role mismatch. Dev != Developer
python check_http_json.py -H localhost:8080 -p data3.json -q "company.employees.(0).role,Developer" "company.employees.(1).role,Designer"
OK: Status OK.
```
```bash
python check_http_json.py -H localhost:8080 -p data4.json -u "ratings(0),4.5"
OK: Status OK.
python check_http_json.py -H localhost:8080 -p data4.json -u "ratings(0),4.1"
UNKNOWN: Status UNKNOWN. Key ratings(0) mismatch. 4.1 != 4.5
```
```bash
python check_http_json.py -H localhost:8080 -p data5.json -q service1.status,True service2.status,True service3.status,True
OK: Status OK.
python check_http_json.py -H localhost:8080 -p data5.json -q "service1.status,True" -q "service2.status,True" -q "service3.status,False"
```

1
test/testdata/data0-invalid.json vendored Normal file

@@ -0,0 +1 @@
No JSON

5
test/testdata/data1.json vendored Normal file

@@ -0,0 +1,5 @@
{
"name": "John Doe",
"age": 30,
"city": "New York"
}

17
test/testdata/data2.json vendored Normal file

@@ -0,0 +1,17 @@
[
{
"id": 1,
"title": "Book One",
"author": "Author One"
},
{
"id": 2,
"title": "Book Two",
"author": "Author Two"
},
{
"id": 3,
"title": "Book Three",
"author": "Author Three"
}
]

18
test/testdata/data3.json vendored Normal file

@@ -0,0 +1,18 @@
{
"company": {
"name": "Tech Corp",
"location": "San Francisco",
"employees": [
{
"name": "Alice",
"role": "Developer"
},
{
"name": "Bob",
"role": "Designer"
}
]
},
"founded": 2010,
"industry": "Technology"
}

13
test/testdata/data4.json vendored Normal file

@@ -0,0 +1,13 @@
{
"id": 123,
"active": true,
"tags": ["tech", "startup", "innovation"],
"details": {
"website": "https://example.com",
"contact": {
"email": "info@example.com",
"phone": "+1-234-567-890"
}
},
"ratings": [4.5, 4.7, 4.8]
}

38
test/testdata/data5.json vendored Normal file

@@ -0,0 +1,38 @@
{
"service1": {
"status": true
},
"service2": {
"status": true,
"meta": {
"res": "PONG"
}
},
"service3": {
"status": true,
"meta": {
"took": 9,
"timed_out": false,
"_shards": {
"total": 0,
"successful": 0,
"skipped": 0,
"failed": 0
},
"hits": {
"total": {
"value": 10000,
"relation": "gte"
},
"max_score": null,
"hits": []
}
}
},
"service4": {
"status": true,
"meta": {
"status": "ok"
}
}
}

7
test/testdata/docker-compose.yml vendored Normal file

@@ -0,0 +1,7 @@
services:
nginx:
image: nginx:1-alpine
ports:
- "8080:80"
volumes:
- ./:/usr/share/nginx/html

21
test/tls/ca-root.pem Normal file

@@ -0,0 +1,21 @@
-----BEGIN CERTIFICATE-----
MIIDbTCCAlWgAwIBAgIUB6EZDl3ajJgJsoLzyC9DrOQQpKowDQYJKoZIhvcNAQEN
BQAwRTELMAkGA1UEBhMCQVUxEzARBgNVBAgMClNvbWUtU3RhdGUxITAfBgNVBAoM
GEludGVybmV0IFdpZGdpdHMgUHR5IEx0ZDAgFw0yNDAzMTgwODE5MDhaGA8yMDUx
MDgwMzA4MTkwOFowRTELMAkGA1UEBhMCQVUxEzARBgNVBAgMClNvbWUtU3RhdGUx
ITAfBgNVBAoMGEludGVybmV0IFdpZGdpdHMgUHR5IEx0ZDCCASIwDQYJKoZIhvcN
AQEBBQADggEPADCCAQoCggEBALVxioj+6zw6Snr+B1JOivC8Of6YptVYym5ICiHX
wjpbSVVe+Py/P2LDb/uQ1QkAENlpvChFqSaRBZU5keXYS/DaFb2Evb2/zf5qIdWU
2ju8B5V13gXSeaNNetyEn1Ivvk0lOCQo2RwEZXuStpLS4Q32rkRBvkoL+RXDc1NX
c3RwcU1p9ybgBqAC7FYdV82sgHGugIrbzkjfFREJXp1AnqvKAdk39b1CnPxfmPZC
nzPPetfr3iivH8yVO5rodU/LDtQNph22JR94YvPB89QO+bZ9bw2GHtPdAKFew9HF
UxM1fmy381Mq2iS3KUq5vsC1jMe8slUAIFYEDzoPvOz+MpcCAwEAAaNTMFEwHQYD
VR0OBBYEFOmCb+JnMzX29hwgtXSzrN+m6mTDMB8GA1UdIwQYMBaAFOmCb+JnMzX2
9hwgtXSzrN+m6mTDMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQENBQADggEB
AAkTD8K4UO4uO4i6p2BCofbhVm9LYA0ulmLO8Uro0U491TeEDOQpgMFAK+b2gZIU
zvDHoCMn3UPVxHKl7XzDgLZVkYYEc2s9vArxk5vSnFmh3XvlDu2SO5gSLB2sf68A
2+Jz2x6z9tjWWdZCGJWU/iwMbG2Y3JMHyv1NMF8cyOclJaSDNBAwF5c5sdlGTLKb
WHGXzVqHSAFlGcHtQrcEKclHiuzw2G3LZzwghGk0XzxwvyKrnAEy408RY0mfNLtz
32KHqYtrip0RYlGWKP7/7q6i0D8muEFW/I4emFI0z0I/1CcYZZS8tQkWaPf/wCN0
llTD1kKJACsIMaqkkyy+EZM=
-----END CERTIFICATE-----

19
test/tls/cert.pem Normal file

@@ -0,0 +1,19 @@
-----BEGIN CERTIFICATE-----
MIIDDzCCAfcCFBOrBcHIH2x9xcUyUeDid0cvBxWtMA0GCSqGSIb3DQEBDQUAMEUx
CzAJBgNVBAYTAkFVMRMwEQYDVQQIDApTb21lLVN0YXRlMSEwHwYDVQQKDBhJbnRl
cm5ldCBXaWRnaXRzIFB0eSBMdGQwIBcNMjQwMzE4MDgxOTM1WhgPMjA1MTA4MDMw
ODE5MzVaMEUxCzAJBgNVBAYTAkFVMRMwEQYDVQQIDApTb21lLVN0YXRlMSEwHwYD
VQQKDBhJbnRlcm5ldCBXaWRnaXRzIFB0eSBMdGQwggEeMA0GCSqGSIb3DQEBAQUA
A4IBCwAwggEGAoH+ALuzyIhEATF5YyAOsXKfr2mttF2HyJvEscGcoA7YetT57bjJ
5lg944kc3QH/wTEdrGda3cwh3OXdUuyR7Wrm9jPw38hMArx/fWPkiISOShrUSHGd
Qyy2bT+zxBaUo+pomyrlqlgwGlbxuwTAlTSFcI+i7yXrckl2HRj40EW4FNsYpPzv
maxRXs0kg0J2JLTYF+fHlqlYbSX/hRU9wz2DYfkRSS0+OYJNSmqK0jayUsdZYurG
gbPwOCgQ0QxLLh7P8z4sOanRowqUzqTI77cyUugEJRyoi+LJr4r0EwMTBX3STgPh
S9B78+LNvwOrLrZFUhr144RfO9QPLnz0uWcCAwEAATANBgkqhkiG9w0BAQ0FAAOC
AQEAeIR21PfLzgpL7WDBE2KgwI78nVc1wY9nwoAxSBzHjS0Olve3r9MaVzAKn5ZS
xHtv8oroXjhTcczCIzxii6Imp6u0iIr3QVBIceofxrH3aWmICURcC9l+dIiY6sk9
Ct8P8gm/Erv2iF/7bnsARwDnw0f41fC9eXtHZ7WLRQrc7tLHpjL0Z7bT77ysQJVK
C1SWtBnq3afmwH3R1wVHENn0JVFQpBp+vqWU5KIlvjcz49yPU+aNODk1rJsHMlgS
x2iddwF31GNOxNfXtw8fdw4UDUl2wYoZ45w2e2pXt4pbN43m0Wys1eQZdk3tyR6G
AZOLP05073mLtbVlFRmcTdXIGg==
-----END CERTIFICATE-----

27
test/tls/key.pem Normal file

@@ -0,0 +1,27 @@
-----BEGIN PRIVATE KEY-----
MIIEqAIBADANBgkqhkiG9w0BAQEFAASCBJIwggSOAgEAAoH+ALuzyIhEATF5YyAO
sXKfr2mttF2HyJvEscGcoA7YetT57bjJ5lg944kc3QH/wTEdrGda3cwh3OXdUuyR
7Wrm9jPw38hMArx/fWPkiISOShrUSHGdQyy2bT+zxBaUo+pomyrlqlgwGlbxuwTA
lTSFcI+i7yXrckl2HRj40EW4FNsYpPzvmaxRXs0kg0J2JLTYF+fHlqlYbSX/hRU9
wz2DYfkRSS0+OYJNSmqK0jayUsdZYurGgbPwOCgQ0QxLLh7P8z4sOanRowqUzqTI
77cyUugEJRyoi+LJr4r0EwMTBX3STgPhS9B78+LNvwOrLrZFUhr144RfO9QPLnz0
uWcCAwEAAQKB/UQAYzMy5/fDkWzoxdLQFV3E56ZG7h+4x+lr0/Ts6rtD/KLIyqHH
ciqXgV4bCSPBK1eabOZqkjvYzhUU3R2wpRu2NWy8VPVzfrr07ZyQbDqCE+jNX6vQ
P44nk2/W0/e1hBmrcOZYLwK2utmC58tKWLhBAEENpq8EkpAcfF/1y9aRHKYwNnH7
vouoQibN5NTs5m8s0VyjRTDwRZja98eWnn5NfU3orqYO8fSlF6CyzDtoyhMco6zR
0skBgMzRYCRTuJpV+KekC7XFYyiJ6XZN5DKLbbqP6Y7YR8wjyFEruoGCS0mZH2H0
9/rhTsJram1B2zohXHPsHJGGGv12/7kCfw5C7yda+8Yv0NmRp1F+EJYb75SCAWIP
kzN/xvjP2bMKa6oSzU0DOga3Wc4ijJHDaND8rqdPqQe3zXFr1nPdBrybLSJ6k5CN
4Dd6ENJWVWino0L460kpLtlBG6TsgmB8bkwhjWVE6Vgt4Vila+a3TGRXeniaRzdw
icNOtMrjYlUCfw0pWEvO2uFq0DbNZbmzC2j5ClFcU96CAl4AqKG2PiGnuSy9TKVZ
c5OiXFmyoig7v4LJzaKLSqVIN4hVBU80/MlhvG+dpeimvLaQKNtlZQethIs5hXlB
R1XfaPhq6BQiYmQ3tufyS/0Es2OY+Cs3LU1uDB8qVzonlmnIi69OwMsCfwRPISfJ
C+4UIIy8v8uVxbk1c6xxo61Xe2jCIQKo+uRoL6PRzoqIgQ3qdI4eTk70tkT/NF6F
aVNVrBOrO78Cd7ihQn/6fX/d/nOExHRpdaELlf70a1NNyEQIsiug8rvonQMP2ENT
ERZ9tmssgG/Tzpc6/1xVcVNFA7spmuL61YkCfwnu2zGTc0PO7kd96rkktIbL9YqD
6NQ0QH8bdildtjSGNc3bLB5ajUytq48Sryk4NogJr8Vt5K8q+qZMrE4kCmgd+C4w
x4b3V9Ncp0k1k/MgdLjyd5aUurbHfpyFapPPg3xpRAR3q/vP8WdIintrECiw1jsr
JFvChtVdQnbTM9MCfw41RcjNwCaIG+uXc8bD6Yf+NyXD8zP6ZDywmBlkMWlGSzx4
xM8J+wQiQsNWthDBbF7inJc+lbtJiEe4YOPkbjCYVZRHribL65HKJlEUv6M9bvQo
3P1DS5tDrwo6z9UPs4tD1SgF9fDu/xA7fwPF1RTvuW07MhFJWlDo4FSWS9c=
-----END PRIVATE KEY-----
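These certificate and key files are plain PEM test fixtures. If you want to verify what they contain before pointing a local TLS test server at them, generic `openssl` commands (not specific to this project) will print the details:
```bash
# Print subject, issuer and validity window of the CA and leaf certificate
openssl x509 -in test/tls/ca-root.pem -noout -subject -issuer -dates
openssl x509 -in test/tls/cert.pem -noout -subject -issuer -dates

# Confirm the private key matches the leaf certificate (the moduli must agree)
openssl x509 -noout -modulus -in test/tls/cert.pem | openssl md5
openssl rsa -noout -modulus -in test/tls/key.pem | openssl md5
```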