├── debian
│   ├── compat
│   ├── source
│   │   ├── format
│   │   ├── options
│   │   └── include-binaries
│   ├── conn-check.links
│   ├── conn-check.triggers
│   ├── copyright
│   ├── rules
│   ├── control
│   └── changelog
├── conn_check
│   ├── utils
│   │   ├── __init__.py
│   │   ├── firewall_rules.py
│   │   └── convert_fw_rules.py
│   ├── version.txt
│   ├── __init__.py
│   ├── patterns.py
│   ├── check_impl.py
│   ├── main.py
│   └── checks.py
├── amqp-requirements.txt
├── fwutils-requirements.txt
├── postgres-requirements.txt
├── redis-requirements.txt
├── mongodb-requirements.txt
├── docs
│   ├── history.rst
│   ├── readme.rst
│   ├── index.rst
│   ├── tutorial-part-3.rst
│   ├── tutorial-part-1.rst
│   ├── Makefile
│   ├── tutorial-part-2.rst
│   ├── make.bat
│   └── conf.py
├── test-requirements.txt
├── tarmac_tests.sh
├── deb-dependencies.txt
├── requirements.txt
├── MANIFEST.in
├── devel-requirements.txt
├── .bzrignore
├── demo.yaml
├── setup.py
├── CHANGELOG
├── Makefile
├── README.rst
├── tests.py
└── LICENSE

/debian/compat:
--------------------------------------------------------------------------------
1 | 7
2 |
--------------------------------------------------------------------------------
/conn_check/utils/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/amqp-requirements.txt:
--------------------------------------------------------------------------------
1 | txAMQP
2 |
--------------------------------------------------------------------------------
/conn_check/version.txt:
--------------------------------------------------------------------------------
1 | 1.3.1
2 |
--------------------------------------------------------------------------------
/debian/source/format:
--------------------------------------------------------------------------------
1 | 3.0 (quilt)
2 |
--------------------------------------------------------------------------------
/fwutils-requirements.txt:
-------------------------------------------------------------------------------- 1 | netaddr 2 | -------------------------------------------------------------------------------- /postgres-requirements.txt: -------------------------------------------------------------------------------- 1 | psycopg2 2 | -------------------------------------------------------------------------------- /redis-requirements.txt: -------------------------------------------------------------------------------- 1 | txredis 2 | -------------------------------------------------------------------------------- /mongodb-requirements.txt: -------------------------------------------------------------------------------- 1 | txmongo >= 0.5 2 | -------------------------------------------------------------------------------- /docs/history.rst: -------------------------------------------------------------------------------- 1 | .. include:: ../CHANGELOG 2 | -------------------------------------------------------------------------------- /docs/readme.rst: -------------------------------------------------------------------------------- 1 | .. 
include:: ../README.rst 2 | -------------------------------------------------------------------------------- /test-requirements.txt: -------------------------------------------------------------------------------- 1 | nose 2 | testtools 3 | -------------------------------------------------------------------------------- /debian/conn-check.links: -------------------------------------------------------------------------------- 1 | /usr/share/python/conn-check/bin/conn-check /usr/bin/conn-check 2 | -------------------------------------------------------------------------------- /debian/source/options: -------------------------------------------------------------------------------- 1 | extend-diff-ignore="(conn_check\.egg-info|virtualenv|build.*|dist|Makefile|docs)" -------------------------------------------------------------------------------- /tarmac_tests.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | set -e 3 | 4 | # Build and test conn-check 5 | make clean build test docs 6 | -------------------------------------------------------------------------------- /deb-dependencies.txt: -------------------------------------------------------------------------------- 1 | gcc 2 | libffi-dev 3 | libpq-dev 4 | python-dev 5 | python-pip 6 | python-virtualenv 7 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | characteristic >= 14.0 2 | cryptography >= 0.5 3 | ndg-httpsclient 4 | pyasn1 5 | pyOpenSSL 6 | service-identity 7 | requests 8 | Twisted 9 | txrequests >= 0.9 10 | PyYAML 11 | -------------------------------------------------------------------------------- /MANIFEST.in: -------------------------------------------------------------------------------- 1 | include LICENSE 2 | include CHANGELOG 3 | include *.rst 4 | include *.txt 5 | include tarmac_tests.sh 6 | include demo.yaml 7 | 
include tests.py 8 | recursive-include conn_check *.txt 9 | recursive-include conn_check *.xml 10 | -------------------------------------------------------------------------------- /devel-requirements.txt: -------------------------------------------------------------------------------- 1 | -r test-requirements.txt 2 | 3 | -r amqp-requirements.txt 4 | -r mongodb-requirements.txt 5 | -r redis-requirements.txt 6 | -r postgres-requirements.txt 7 | -r fwutils-requirements.txt 8 | 9 | -r requirements.txt 10 | -------------------------------------------------------------------------------- /.bzrignore: -------------------------------------------------------------------------------- 1 | build 2 | conn_check.egg-info 3 | dist 4 | wheels 5 | virtualenv 6 | .pc 7 | debian/pythoncache 8 | debian/conn-check 9 | debian/conn-check.debhelper.log 10 | debian/conn-check.postinst.debhelper 11 | debian/conn-check.substvars 12 | debian/files 13 | debian-requirements-filtered.txt 14 | _build 15 | -------------------------------------------------------------------------------- /debian/conn-check.triggers: -------------------------------------------------------------------------------- 1 | # Register interest in Python interpreter changes (Python 2 for now); and 2 | # don't make the Python package dependent on the virtualenv package 3 | # processing (noawait) 4 | interest-noawait /usr/bin/python2.6 5 | interest-noawait /usr/bin/python2.7 6 | 7 | # Also provide a symbolic trigger for all dh-virtualenv packages 8 | interest dh-virtualenv-interpreter-update 9 | -------------------------------------------------------------------------------- /docs/index.rst: -------------------------------------------------------------------------------- 1 | Welcome to the conn-check documentation. 2 | ======================================== 3 | 4 | Contents: 5 | 6 | .. 
toctree:: 7 | :maxdepth: 2 8 | 9 | readme 10 | tutorial-part-1 11 | tutorial-part-2 12 | tutorial-part-3 13 | History 14 | 15 | 16 | Indices and tables 17 | ================== 18 | 19 | * :ref:`genindex` 20 | * :ref:`modindex` 21 | * :ref:`search` 22 | 23 | -------------------------------------------------------------------------------- /conn_check/__init__.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python -uWignore 2 | """Check connectivity to various services. 3 | """ 4 | 5 | import os 6 | 7 | def get_version_string(): 8 | return open(os.path.join(os.path.dirname(__file__), 9 | 'version.txt'), 'r').read().strip() 10 | 11 | 12 | def get_version(): 13 | return get_version_string().split('.') 14 | 15 | 16 | __version__ = get_version_string() 17 | 18 | 19 | from twisted.internet import epollreactor 20 | epollreactor.install() 21 | -------------------------------------------------------------------------------- /demo.yaml: -------------------------------------------------------------------------------- 1 | - type: tcp 2 | host: login.ubuntu.com 3 | port: 80 4 | - type: udp 5 | host: localhost 6 | port: 8080 7 | send: aaaa 8 | expect: bbbb 9 | tags: 10 | - foo 11 | - type: tls 12 | host: login.ubuntu.com 13 | port: 443 14 | verify: true 15 | tags: 16 | - bar 17 | - type: redis 18 | host: 127.0.0.1 19 | port: 6379 20 | password: foobared 21 | - type: http 22 | url: https://login.ubuntu.com/ 23 | tags: 24 | - foo 25 | - bar 26 | - type: memcache 27 | host: 127.0.0.1 28 | port: 11211 29 | -------------------------------------------------------------------------------- /debian/copyright: -------------------------------------------------------------------------------- 1 | Format: http://www.debian.org/doc/packaging-manuals/copyright-format/1.0/ 2 | Upstream-Name: conn-check 3 | Upstream-Contact: Wes Mason 4 | Source: https://launchpad.net/conn-check 5 | 6 | Files: * 7 | Copyright: 2014, Canonical Ltd. 
8 | License: GPL-3+
9 |
10 | License: GPL-3+
11 |  conn-check is free software: you can redistribute it and/or modify
12 |  it under the terms of the GNU General Public License as published by
13 |  the Free Software Foundation, either version 3 of the License, or
14 |  (at your option) any later version.
15 |  .
16 |  conn-check is distributed in the hope that it will be useful,
17 |  but WITHOUT ANY WARRANTY; without even the implied warranty of
18 |  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
19 |  GNU General Public License for more details.
20 |  .
21 |  You should have received a copy of the GNU General Public License
22 |  along with conn-check. If not, see <http://www.gnu.org/licenses/>.
23 |  .
24 |  On Debian systems, the full text of the GNU General Public
25 |  License version 3 can be found in the file
26 |  `/usr/share/common-licenses/GPL-3'.
27 |
28 |
--------------------------------------------------------------------------------
/debian/rules:
--------------------------------------------------------------------------------
1 | #!/usr/bin/make -f
2 |
3 | %:
4 | 	dh $@ --with python-virtualenv
5 |
6 | REQUIREMENTS_ARGS := --extra-pip-arg "-r" --extra-pip-arg "./redis-requirements.txt" \
7 | 	--extra-pip-arg "-r" --extra-pip-arg "amqp-requirements.txt" \
8 | 	--extra-pip-arg "-r" --extra-pip-arg "mongodb-requirements.txt" \
9 | 	--extra-pip-arg "-r" --extra-pip-arg "postgres-requirements.txt" \
10 | 	--preinstall ipaddress \
11 | 	--preinstall idna
12 | override_dh_virtualenv:
13 | 	dh_virtualenv --pypi-url file://$(CURDIR)/debian/pythoncache/simple --use-system-packages $(REQUIREMENTS_ARGS)
14 | 	sed -i '/^requests/d' ./debian/conn-check/usr/share/python/conn-check/lib/python2.7/site-packages/txrequests-*.egg-info/requires.txt
15 | 	sed -i '/^requests/d' ./debian/conn-check/usr/share/python/conn-check/lib/python2.7/site-packages/cryptography-*.egg-info/requires.txt
16 | 	sed -i '/^six/d' ./debian/conn-check/usr/share/python/conn-check/lib/python2.7/site-packages/cryptography-*.egg-info/requires.txt
17 |
18 | override_dh_auto_clean:
19 | 	python setup.py clean -a
20 | 	find . -name \*.pyc -exec rm {} \;
21 |
22 | override_dh_auto_build:
23 | override_dh_auto_test:
24 |
--------------------------------------------------------------------------------
/debian/control:
--------------------------------------------------------------------------------
1 | Source: conn-check
2 | Maintainer: Wes Mason
3 | Section: admin
4 | Priority: optional
5 | Build-Depends: python:any (>= 2.6.6-3), debhelper (>= 7.4.3), dh-virtualenv, python-dev, python-setuptools, libffi-dev, libssl-dev, python-twisted, python-yaml, python-openssl, python-requests, python-pyasn1, python-pyasn1-modules, python-zope.interface, python-cffi, python-six, python-enum34, python-psycopg2, python-txamqp, python-pymongo
6 | Standards-Version: 3.9.5
7 | Homepage: https://launchpad.net/conn-check
8 |
9 | Package: conn-check
10 | Architecture: all
11 | Pre-Depends: dpkg (>= 1.16.1), python2.7-minimal | python2.6-minimal, ${misc:Pre-Depends}
12 | Depends: ${misc:Depends}, ${python:Depends}, python-twisted, python-yaml, python-openssl, python-requests, python-pyasn1, python-pyasn1-modules, python-zope.interface, python-cffi, python-six, python-enum34
13 | Suggests: python-psycopg2, python-txamqp, python-pymongo
14 | Description: Utility/library for checking connectivity between services
15 |  conn-check allows for checking connectivity with external services.
16 |  You can write a config file that defines services that you need to
17 |  have access to, and conn-check will check connectivity with each.
18 |  It supports various types of services, all of which allow for
19 |  basic network checks, but some allow for confirming that
20 |  credentials also work.
21 |
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | """Installer for conn-check
2 | """
3 |
4 | import os
5 | cwd = os.path.dirname(__file__)
6 | __version__ = open(os.path.join(cwd, 'conn_check/version.txt'),
7 |                    'r').read().strip()
8 |
9 | from setuptools import setup, find_packages
10 |
11 |
12 | def get_requirements(*pre):
13 |     extras = []
14 |
15 |     # Base requirements
16 |     if not pre:
17 |         pre = ('',)
18 |
19 |     for p in pre:
20 |         sep = '-' if p else ''
21 |         extras.extend(open('{}{}requirements.txt'.format(p, sep)).readlines())
22 |     return extras
23 |
24 |
25 | setup(
26 |     name='conn-check',
27 |     description='Utility for verifying connectivity between services',
28 |     long_description=open('README.rst').read(),
29 |     version=__version__,
30 |     author='James Westby, Wes Mason',
31 |     author_email='james.westby@canonical.com, wesley.mason@canonical.com',
32 |     url='http://conn-check.org/',
33 |     packages=find_packages(exclude=['ez_setup']),
34 |     install_requires=get_requirements(),
35 |     extras_require={
36 |         'all': get_requirements('amqp', 'postgres', 'redis', 'mongodb',
37 |                                 'fwutils'),
38 |         'amqp': get_requirements('amqp'),
39 |         'postgres': get_requirements('postgres'),
40 |         'redis': get_requirements('redis'),
41 |         'mongodb': get_requirements('mongodb'),
42 |         'fwutils': get_requirements('fwutils'),
43 |     },
44 |     package_data={'conn_check': ['version.txt', 'amqp0-8.xml']},
45 |     include_package_data=True,
46 |     entry_points={
47 |         'console_scripts': [
48 |             'conn-check = conn_check.main:main',
49 |             'conn-check-export-fw = conn_check.utils.firewall_rules:main',
50 |             'conn-check-convert-fw = conn_check.utils.convert_fw_rules:main',
51 |         ],
52 |     },
53 |     license='GPL3',
54 |     classifiers=[
55 |         "Topic :: System :: Networking",
56 |         "Development Status :: 4 - Beta",
57 |         "Programming Language :: Python",
58 |         "Intended Audience :: Developers",
59 |
"Operating System :: OS Independent", 60 | "Intended Audience :: System Administrators", 61 | "License :: OSI Approved :: GNU General Public License v3 (GPLv3)", 62 | ] 63 | ) 64 | -------------------------------------------------------------------------------- /debian/source/include-binaries: -------------------------------------------------------------------------------- 1 | conn_check/__init__.pyc 2 | conn_check/check_impl.pyc 3 | conn_check/checks.pyc 4 | conn_check/main.pyc 5 | conn_check/patterns.pyc 6 | conn_check/utils/__init__.pyc 7 | conn_check/utils/firewall_rules.pyc 8 | conn_check/utils/convert_fw_rules.pyc 9 | tests.pyc 10 | debian/pythoncache 11 | debian/pythoncache/idna-2.0-py2.py3-none-any.whl 12 | debian/pythoncache/cryptography-0.9.1.tar.gz 13 | debian/pythoncache/txredis-2.3.tar.gz 14 | debian/pythoncache/enum34-1.0.4.tar.gz 15 | debian/pythoncache/txrequests-0.9.2.tar.gz 16 | debian/pythoncache/ndg_httpsclient-0.4.0.tar.gz 17 | debian/pythoncache/ipaddress-1.0.7-py27-none-any.whl 18 | debian/pythoncache/txmongo-15.1.0-py2.py3-none-any.whl 19 | debian/pythoncache/characteristic-14.3.0-py2.py3-none-any.whl 20 | debian/pythoncache/cryptography-0.9.3.tar.gz 21 | debian/pythoncache/ipaddress-1.0.12-py27-none-any.whl 22 | debian/pythoncache/ipaddress-1.0.14-py27-none-any.whl 23 | debian/pythoncache/netaddr-0.7.15-py2.py3-none-any.whl 24 | debian/pythoncache/txmongo-15.0.0-py2.py3-none-any.whl 25 | debian/pythoncache/service_identity-14.0.0-py2.py3-none-any.whl 26 | debian/pythoncache/cryptography-0.9.tar.gz 27 | debian/pythoncache/simple 28 | debian/pythoncache/simple/cryptography 29 | debian/pythoncache/simple/cryptography/cryptography-0.9.1.tar.gz 30 | debian/pythoncache/simple/cryptography/cryptography-0.9.3.tar.gz 31 | debian/pythoncache/simple/cryptography/cryptography-0.9.tar.gz 32 | debian/pythoncache/simple/txrequests 33 | debian/pythoncache/simple/txrequests/txrequests-0.9.2.tar.gz 34 | debian/pythoncache/simple/txredis 35 | 
debian/pythoncache/simple/txredis/txredis-2.3.tar.gz 36 | debian/pythoncache/simple/ndg-httpsclient 37 | debian/pythoncache/simple/ndg-httpsclient/ndg-httpsclient-0.4.0.tar.gz 38 | debian/pythoncache/simple/ipaddress 39 | debian/pythoncache/simple/ipaddress/ipaddress-1.0.7-py27-none-any.whl 40 | debian/pythoncache/simple/ipaddress/ipaddress-1.0.12-py27-none-any.whl 41 | debian/pythoncache/simple/ipaddress/ipaddress-1.0.14-py27-none-any.whl 42 | debian/pythoncache/simple/characteristic 43 | debian/pythoncache/simple/characteristic/characteristic-14.3.0-py2.py3-none-any.whl 44 | debian/pythoncache/simple/enum34 45 | debian/pythoncache/simple/enum34/enum34-1.0.4.tar.gz 46 | debian/pythoncache/simple/service-identity 47 | debian/pythoncache/simple/service-identity/service-identity-14.0.0-py2.py3-none-any.whl 48 | debian/pythoncache/simple/netaddr 49 | debian/pythoncache/simple/netaddr/netaddr-0.7.15-py2.py3-none-any.whl 50 | debian/pythoncache/simple/txmongo 51 | debian/pythoncache/simple/txmongo/txmongo-15.1.0-py2.py3-none-any.whl 52 | debian/pythoncache/simple/txmongo/txmongo-15.0.0-py2.py3-none-any.whl 53 | debian/pythoncache/simple/idna 54 | debian/pythoncache/simple/idna/idna-2.0-py2.py3-none-any.whl 55 | -------------------------------------------------------------------------------- /conn_check/utils/firewall_rules.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import socket 3 | import yaml 4 | 5 | from ..main import Command, parse_version_arg 6 | 7 | 8 | class FirewallRulesOutput(object): 9 | """Outputs a set of YAML firewall rules matching checks.""" 10 | 11 | def __init__(self, output): 12 | self.output = output 13 | self.output_data = {} 14 | self.fqdn = socket.getfqdn() 15 | 16 | def notify_skip(self, name): 17 | """Passes skips. 18 | 19 | Should pass everything when using the skipping_strategy, directly to 20 | write(). 
21 | """ 22 | 23 | self.write(name) 24 | 25 | def write(self, data): 26 | """Filters out non-TCP/UDP checks. 27 | 28 | Stores host/port/proto info for output later as YAML. 29 | """ 30 | 31 | # We only need TCP/UDP checks 32 | if not any(x in data for x in ('tcp', 'udp')): 33 | return 34 | 35 | # Here we take the list of colon separated values in reverse order, so 36 | # we should get just the host/port/proto for the check without the 37 | # specific prefix (e.g. memcache, http) 38 | port, host, protocol = reversed(data.split(':')[-3:]) 39 | protocol = protocol.strip() 40 | 41 | key = "{}:{}".format(host, protocol) 42 | if key not in self.output_data: 43 | self.output_data[key] = { 44 | 'from_host': self.fqdn, 45 | 'to_host': host, 46 | 'ports': [], 47 | 'protocol': protocol, 48 | } 49 | 50 | port = int(port) 51 | if port not in self.output_data[key]['ports']: 52 | self.output_data[key]['ports'].append(port) 53 | 54 | def flush(self): 55 | """Outputs our structured egress firewall info as YAML.""" 56 | 57 | self.output.write(yaml.dump({'egress': self.output_data.values()})) 58 | 59 | 60 | class FirewallExportCommand(Command): 61 | """CLI command runner for conn-check-export-fw""" 62 | 63 | def wrap_output(self, output): 64 | """Wraps output stream. 65 | 66 | Override some options in order to just output fw rules without 67 | performing checks. 
68 | """ 69 | # We don't want to actually perform the checks 70 | self.options.dry_run = True 71 | self.options.buffer_output = False 72 | self.options.show_duration = False 73 | 74 | super(FirewallExportCommand, self).wrap_output(output) 75 | 76 | self.output = FirewallRulesOutput(self.output) 77 | self.results.output = self.output 78 | 79 | 80 | def run(*args): 81 | if parse_version_arg(): 82 | return 0 83 | 84 | cmd = FirewallExportCommand(args) 85 | return cmd.run() 86 | 87 | 88 | def main(): 89 | sys.exit(run(*sys.argv[1:])) 90 | 91 | 92 | if __name__ == '__main__': 93 | main() 94 | -------------------------------------------------------------------------------- /CHANGELOG: -------------------------------------------------------------------------------- 1 | ChangeLog for conn-check 2 | ======================== 3 | 4 | 1.3.1 (2015-08-11) 5 | ------------------ 6 | 7 | - Added guards for port numbers and the HTTP checks expected_code to cast 8 | any given value to an int. 9 | 10 | 1.3.0 (2015-07-15) 11 | ------------------ 12 | 13 | - Added new conn-check-convert-fw tool to generate aws/neutron/nova/iptables rule 14 | commands from YAML exported by conn-check-export-fw. 15 | 16 | 1.2.0 (2015-06-19) 17 | ------------------ 18 | 19 | - Added new smtp check to test auth/sending with SMTP servers. 20 | 21 | 1.1.0 (2015-06-05) 22 | ------------------ 23 | 24 | - Added new conn-check-export-fw tool to export firewall egress rules in a YAML format. 25 | - Refactored CLI command handling code to make it easier to extend/override. 26 | 27 | 1.0.18 (2015-04-13) 28 | ------------------- 29 | 30 | - Ensure pyOpenSSL is always used instead of the ssl modules, 31 | see https://urllib3.readthedocs.org/en/latest/security.html#pyopenssl. 32 | 33 | 1.0.17 (2015-04-08) 34 | ------------------- 35 | 36 | - Unpin python-requests for wider distribution (e.g. precise). 
37 |
38 | 1.0.16 (2015-03-06)
39 | -------------------
40 |
41 | - Add --include-tags and --exclude-tags args with support for the `tags` YAML check field.
42 |
43 | 1.0.15 (2015-02-24)
44 | -------------------
45 |
46 | - Package manifest fixes for debian package release.
47 |
48 |
49 | 1.0.13 (26-11-2014)
50 | -------------------
51 |
52 | - Output is now buffered and ordered, with FAILED checks first, skipped last,
53 |   and each check grouped by {type}:{host/url}.
54 | - TCP subchecks triggered by an HTTP check are prefixed as such.
55 | - There is now a -U/--unbuffered-output option to disable buffered/ordered output
56 |   and write out to STDOUT as soon as a result is collected.
57 |
58 | 1.0.12 (17-11-2014)
59 | -------------------
60 |
61 | - Command aliasing refactored, and more aliases added.
62 |
63 | 1.0.11 (04-11-2014)
64 | -------------------
65 |
66 | - Disabled 30x redirects in HTTP checks by default, fixing regression introduced by requests switch.
67 | - Added python-requests specific options for proxy, param, cookie and auth control in HTTP checks.
68 |
69 | 1.0.10 (30-10-2014)
70 | -------------------
71 |
72 | - Added a mongodb check type.
73 |
74 | 1.0.9 (23-10-2014)
75 | ------------------
76 |
77 | - Added --max-timeout CLI option to restrict maximum execution time.
78 | - Added connection timeouts to HTTP and PostgreSQL checks.
79 | - Added --connect-timeout CLI option to set global connection timeout.
80 | - Added timeout option to each individual check to override global connection timeout.
81 |
82 | 1.0.8 (22-10-2014)
83 | ------------------
84 |
85 | - Switched to using txrequests for HTTP requests with better proxy support.
86 | - Fixed UDP checks targeting host rather than IP if available.
87 | - Fixed initial TCP check for HTTP checks targeting upstream instead of proxy.
88 |
89 | 1.0.7 (09-10-2014)
90 | ------------------
91 |
92 | - Fixed HTTP proxy error in HTTP checks due to typo.
93 |
94 | 1.0.6 (06-10-2014)
95 | ------------------
96 | - Fixed dependencies when installing from local dir.
97 | - Made improvements to readme.
98 |
99 |
100 | 1.0.5 (03-10-2014)
101 | ------------------
102 |
103 | - Added optional headers and body arguments to HTTP checks.
104 |
105 |
106 | 1.0.4 (29-09-2014)
107 | ------------------
108 |
109 | - Added HTTP proxy support to http checks
110 | - Fixed issue with loading duplicate SSL CA certificates, and added flag to load from a custom dir
111 |
112 |
113 | 1.0.3 (24-09-2014)
114 | ------------------
115 |
116 | - Moved the config_generators module into its own package: conn-check-configs
117 |
118 |
119 | 1.0.2 (22-09-2014)
120 | ------------------
121 |
122 | - Added a script to auto-generate conn-check config YAML from a Django settings module
123 |
124 |
125 | 1.0.1 (18-09-2014)
126 | ------------------
127 |
128 | - Trivial release to fix setup.py tags
129 |
130 |
131 | 1.0.0 (18-09-2014)
132 | ------------------
133 |
134 | - Initial release
135 | - Broken free of UbuntuOne
136 | - Nagios compatible output
137 | - YAML configuration
138 |
--------------------------------------------------------------------------------
/conn_check/utils/convert_fw_rules.py:
--------------------------------------------------------------------------------
1 | from argparse import ArgumentParser
2 | from netaddr import IPNetwork
3 | from socket import gethostbyname
4 | import sys
5 | import yaml
6 |
7 |
8 | COMMANDS = {
9 |     'aws': ('aws ec2 authorize-security-group-egress --group-id {group}'
10 |             ' --protocol {protocol} --port {port} --cidr {to_cidr}'),
11 |     'neutron': ('neutron security-group-rule-create --direction egress'
12 |                 ' --ethertype {ip_version} --protocol {protocol} '
13 |                 ' --port-range-min {port} --port-range-max {port} '
14 |                 ' --remote-ip-prefix {to_cidr} {group}'),
15 |     'nova': ('nova secgroup-add-rule {group} {protocol} {port} {port}'
16 |              ' {to_cidr}'),
17 |     'iptables': ('iptables -A FORWARD -p {protocol} --dport {port}'
18 |                  ' -s {from_host} -d {to_host} -j ACCEPT'),
19 |     'ufw': ('ufw allow proto {protocol} from any to {to_cidr} port {port}'),
20 | }
21 |
22 | COMMAND_ALIASES = {
23 |     'amazon': 'aws',
24 |     'ec2': 'aws',
25 |     'openstack': 'neutron',
26 |     'os': 'neutron',
27 | }
28 |
29 | SECGROUP_COMMANDS = ('aws', 'neutron', 'nova')
30 |
31 |
32 | def merge_yaml(paths, use_from=False):
33 |     """Merge multiple firewall YAML rule files with hosts/ports de-duped."""
34 |     merged_rules = {}
35 |
36 |     for path in paths:
37 |         rules = yaml.load(open(path))
38 |
39 |         for rule in rules.get('egress', []):
40 |             from_ip = IPNetwork(gethostbyname(rule['from_host']))
41 |             to_ip = IPNetwork(gethostbyname(rule['to_host']))
42 |
43 |             # We need these values for de-duping, so we may as well add them
44 |             # into the rule here to use later when generating commands.
45 |             rule['to_cidr'] = str(to_ip.cidr)
46 |             rule['ip_version'] = to_ip.version
47 |
48 |             if use_from:
49 |                 rule['from_cidr'] = str(from_ip.cidr)
50 |                 key_template = '{protocol}:{from_cidr}:{to_cidr}'
51 |             else:
52 |                 key_template = '{protocol}:{to_cidr}'
53 |             key = key_template.format(**rule)
54 |
55 |             if key not in merged_rules:
56 |                 merged_rules[key] = rule
57 |             else:
58 |                 ports = merged_rules[key]['ports']
59 |                 ports_diff = set(rule['ports']).difference(ports)
60 |
61 |                 if ports_diff:
62 |                     merged_rules[key]['ports'] = ports + sorted(ports_diff)
63 |
64 |     return merged_rules.values()
65 |
66 |
67 | def generate_commands(cmd, rules, group=None):
68 |     """Generate firewall client commands from conn-check firewall rules."""
69 |     output = []
70 |     for rule in rules:
71 |         for port in rule['ports']:
72 |             params = {
73 |                 'group': group or '$SECGROUP',
74 |                 'port': port,
75 |             }
76 |             params.update(rule)
77 |             output.append(COMMANDS[cmd].format(**params))
78 |
79 |     return output
80 |
81 |
82 | def run(*args):
83 |     parser = ArgumentParser()
84 |     parser.add_argument('-t', '--type', dest='output_type', required=True,
85 |                         help="Rules output type, e.g.
neutron, nova, aws," 86 | " iptables, ufw.") 87 | parser.add_argument("paths", nargs='+', 88 | help="Paths to YAML files to combine/parse.") 89 | parser.add_argument('--group', dest='group', required=False, 90 | help="AWS security group ID or OpenStack Neutron group" 91 | " name.") 92 | options = parser.parse_args(list(args)) 93 | 94 | available_commands = COMMANDS.keys() + COMMAND_ALIASES.keys() 95 | output_type = options.output_type.lower() 96 | if output_type not in available_commands: 97 | sys.stderr.write('Error: invalid output type ({})\n'.format( 98 | options.output_type)) 99 | return 1 100 | 101 | command_type = COMMAND_ALIASES.get(output_type, output_type) 102 | rules = merge_yaml(options.paths, (command_type in SECGROUP_COMMANDS)) 103 | output_rules = generate_commands(command_type, rules, options.group) 104 | 105 | sys.stdout.write('{}\n'.format('\n'.join(output_rules))) 106 | return 0 107 | 108 | 109 | def main(): 110 | sys.exit(run(*sys.argv[1:])) 111 | 112 | 113 | if __name__ == '__main__': 114 | main() 115 | -------------------------------------------------------------------------------- /docs/tutorial-part-3.rst: -------------------------------------------------------------------------------- 1 | Tutorial Part 3: Adding conn-check to Juju deployed services 2 | ============================================================ 3 | 4 | Juju 5 | ---- 6 | 7 | `Juju `_ is an open source service orientated 8 | framework and deployment toolset from Canonical, given conn-check is also by 9 | Canonical you might expect there is an easy yet flexible way to add conn-check 10 | to your Juju environment. 11 | 12 | You'd be right… 13 | 14 | Adding conn-check charm support to your apps charm 15 | -------------------------------------------------- 16 | 17 | The `conn-check charm `_ 18 | is a subordinate charm that can be added alongside your applications charm, 19 | and will install/configure conn-check on your application units. 
20 |
21 | To enable support for the conn-check subordinate in your application's charm
22 | you need to implement the ``conn-check-relation-changed`` hook, e.g.:
23 |
24 | .. code-block:: bash
25 |
26 |     #!/bin/bash
27 |     set -e
28 |     CONFIG_PATH=/var/conn-check.yaml
29 |
30 |     juju-log "Writing conn-check config to ${CONFIG_PATH}"
31 |     /path/to/hwaas/settings-to-conn-check.py -f ${CONFIG_PATH} -m hwaas.settings
32 |
33 |     # Ensure conn-check and nagios can both access the config file
34 |     chown conn-check:nagios ${CONFIG_PATH}
35 |     chmod 0660 ${CONFIG_PATH}
36 |
37 |     # Set the config path, we could also tell the conn-check charm
38 |     # to write the config file for us by setting the "config" option
39 |     # but this is deprecated in favour of writing the file ourselves
40 |     # and setting "config_path"
41 |     relation-set config_path="${CONFIG_PATH}"
42 |
43 | You may note that we set the user to ``conn-check`` and the group to ``nagios``.
44 | You can actually get away with just setting the group to ``nagios``, as this
45 | will give both conn-check and nagios access to the config file, but you might
46 | as well set the user anyway, otherwise it's likely to be ``root``.
47 |
48 | You'll also need to tell Juju your charm provides the ``conn-check`` relation
49 | in your ``metadata.yaml``:
50 |
51 | .. code-block:: yaml
52 |
53 |     provides:
54 |       conn-check:
55 |         interface: conn-check
56 |         scope: container
57 |
58 | When deploying conn-check with your service you then deploy the subordinate and
59 | relate it to your service (you can also optionally set it as a :ref:`nagios`
60 | provider):
61 |
62 | .. code-block:: sh
63 |
64 |     $ juju deploy cs:~ubuntuone-hackers/trusty/conn-check my-service-conn-check
65 |     $ juju set my-service-conn-check revision=108 # pin to the rev of conn-check you want to use
66 |     $ juju add-relation my-service my-service-conn-check
67 |
68 |
69 | .. _nagios:
70 |
71 | Nagios
72 | ------
73 |
74 | The conn-check charm provides the ``nrpe-external-master`` relation, which
75 | means it can act as a Nagios plugin executor, so if you have a Nagios
76 | master in your environment for monitoring then conn-check can be regularly
77 | run along with your other monitoring checks to ensure your environment's
78 | connections are as you expect them to be.
79 |
80 | To set this up you need to relate the deployed subordinate to your service's nrpe:
81 |
82 | .. code-block:: sh
83 |
84 |     $ # assuming something like:
85 |     $ # juju deploy nagios nagios-master
86 |     $ # juju deploy nrpe my-service-nrpe
87 |     $ # juju add-relation my-service:monitors my-service-nrpe:monitors
88 |     $ juju add-relation my-service-conn-check my-service-nrpe
89 |
90 | For more details on Juju and Nagios you can see
91 | `this handy blog post `_.
92 |
93 | Actions
94 | -------
95 |
96 | To manually run conn-check on all units, or a single unit, you can use the
97 | supplied ``run-check`` and ``run-nagios-check`` actions:
98 |
99 | .. code-block:: sh
100 |
101 |     $ # all checks on all units
102 |     $ juju run --service my-service-conn-check 'actions/run-check'
103 |     $ # all checks on just unit 0
104 |     $ juju run --service my-service-conn-check/0 'actions/run-check'
105 |     $ # nagios (not including no-nagios) checks on all units
106 |     $ juju run --service my-service-conn-check 'actions/run-nagios-check'
107 |     $ # nagios (not including no-nagios) checks on just unit 0
108 |     $ juju run --service my-service-conn-check/0 'actions/run-nagios-check'
109 |
110 | **Note**: before Juju 1.21 there is a
111 | `bug `_ which prevents
112 | juju-run from working with subordinate charms; you can work around this with
113 | juju-ssh:
114 |
115 | ..
code-block:: sh 116 | 117 | $ # all checks on just unit 0 118 | $ juju ssh my-service-conn-check/0 'juju-run my-service-conn-check/0 actions/run-check' 119 | -------------------------------------------------------------------------------- /Makefile: -------------------------------------------------------------------------------- 1 | ENV = virtualenv 2 | WHEELS_DIR = ./wheels 3 | WHEELS_BRANCH = lp:~ubuntuone-hackers/conn-check/wheels 4 | WHEELS_BRANCH_DIR = /tmp/conn-check-wheels 5 | CONN_CHECK_REVNO = $(shell bzr revno) 6 | CONN_CHECK_VERSION = $(shell cat conn_check/version.txt) 7 | CONN_CHECK_PPA = ppa:wesmason/conn-check 8 | DEBIAN_PYTHON_CACHE_DIR = debian/pythoncache 9 | DEBIAN_PYTHON_PACKAGES_FILTER = Twisted txAMQP pyOpenSSL pyasn1 PyYAML psycopg2 requests cffi pycparser six setuptools zope.interface pymongo 10 | HERE = $(patsubst %/,%,$(dir $(realpath $(lastword $(MAKEFILE_LIST))))) 11 | DOCS_DIR = $(HERE)/docs 12 | 13 | $(ENV): 14 | virtualenv $(ENV) 15 | 16 | build: $(ENV) 17 | $(ENV)/bin/pip install -r devel-requirements.txt 18 | $(ENV)/bin/python setup.py develop 19 | 20 | test: $(ENV) 21 | $(ENV)/bin/nosetests 22 | 23 | clean-wheels: 24 | -rm -r $(WHEELS_DIR) 25 | -rm -r $(WHEELS_BRANCH_DIR) 26 | 27 | clean-docs: 28 | -rm -r $(DOCS_DIR)/_build 29 | 30 | clean: clean-wheels clean-docs 31 | -rm -r $(ENV) 32 | -rm -r dist 33 | -rm -r $(DEBIAN_PYTHON_CACHE_DIR) 34 | -rm -r conn_check.egg-info 35 | find . 
-name "*.pyc" -delete 36 | 37 | install-debs: 38 | sudo xargs --arg-file deb-dependencies.txt apt-get install -y 39 | 40 | install-deb-pkg-debs: install-debs 41 | sudo apt-get install -y build-essential packaging-dev dh-make 42 | 43 | $(ENV)/bin/pip2tgz: $(ENV) 44 | $(ENV)/bin/pip install pip2pi 45 | 46 | build-deb-pip-cache: $(ENV)/bin/pip2tgz 47 | mkdir -p $(DEBIAN_PYTHON_CACHE_DIR) 48 | ls *requirements.txt | grep -vw 'devel\|test' | xargs -I{} \ 49 | cat {} | sort | uniq > debian-requirements-filtered.txt 50 | @echo '$(DEBIAN_PYTHON_PACKAGES_FILTER)' \ 51 | | tr " " "\n" \ 52 | | xargs -L 1 -I{} \ 53 | sed -i '/^{}/d' debian-requirements-filtered.txt 54 | $(ENV)/bin/pip2tgz $(DEBIAN_PYTHON_CACHE_DIR) -r debian-requirements-filtered.txt 55 | -rm debian-requirements-filtered.txt 56 | @echo 'Removing upstream Debian python-* packages from cache..' 57 | @echo '$(DEBIAN_PYTHON_PACKAGES_FILTER)' \ 58 | | tr " " "\n" \ 59 | | xargs -L 1 -I{} find $(DEBIAN_PYTHON_CACHE_DIR) -maxdepth 2 -name '{}*' \ 60 | | xargs rm -r 61 | $(ENV)/bin/dir2pi $(DEBIAN_PYTHON_CACHE_DIR) 62 | sed -i '/pythoncache/d' debian/source/include-binaries 63 | find debian/pythoncache -path "*.html" -prune -o -print >> debian/source/include-binaries 64 | 65 | ../conn-check_$(CONN_CHECK_VERSION).orig.tar.gz: 66 | $(ENV)/bin/python setup.py sdist 67 | cp dist/conn-check-$(CONN_CHECK_VERSION).tar.gz ../conn-check_$(CONN_CHECK_VERSION).orig.tar.gz 68 | 69 | check-deb-version: 70 | @(grep -F '($(CONN_CHECK_VERSION)-' debian/changelog) || \ 71 | (echo 'ERROR: $(CONN_CHECK_VERSION) not found in debian/changelog, have you added it?' && exit 1) 72 | 73 | build-deb: check-deb-version build-deb-pip-cache ../conn-check_$(CONN_CHECK_VERSION).orig.tar.gz 74 | -rm ../conn-check_$(CONN_CHECK_VERSION)-* 75 | debuild -S -sa 76 | 77 | test-build-deb: build-deb 78 | debuild 79 | 80 | update-ppa: 81 | cd .. 
&& dput $(CONN_CHECK_PPA) conn-check_$(CONN_CHECK_VERSION)-*_source.changes 82 | 83 | cmd: 84 | @echo $(ENV)/bin/conn-check 85 | 86 | fw-cmd: 87 | @echo $(ENV)/bin/conn-check-export-fw 88 | 89 | fw-convert-cmd: 90 | @echo $(ENV)/bin/conn-check-convert-fw 91 | 92 | pip-wheel: $(ENV) 93 | @$(ENV)/bin/pip install wheel 94 | 95 | $(WHEELS_DIR): 96 | mkdir $(WHEELS_DIR) 97 | 98 | build-wheels: pip-wheel $(WHEELS_DIR) $(ENV) 99 | $(ENV)/bin/pip wheel --wheel-dir=$(WHEELS_DIR) . 100 | 101 | build-wheels-extra: pip-wheel $(WHEELS_DIR) $(ENV) 102 | $(ENV)/bin/pip wheel --wheel-dir=$(WHEELS_DIR) -r ${EXTRA}-requirements.txt 103 | 104 | build-wheels-all-extras: pip-wheel $(WHEELS_DIR) $(ENV) 105 | ls *-requirements.txt | grep -vw 'devel\|test' | xargs -L 1 \ 106 | $(ENV)/bin/pip wheel --wheel-dir=$(WHEELS_DIR) -r 107 | 108 | test-wheels: build-wheels build-wheels-all-extras 109 | $(ENV)/bin/pip install -r test-requirements.txt 110 | $(ENV)/bin/pip install --upgrade --no-index --find-links $(WHEELS_DIR) -r requirements.txt 111 | ls *-requirements.txt | grep -vw 'devel\|test' | xargs -L 1 \ 112 | $(ENV)/bin/pip install --ignore-installed --no-index --find-links $(WHEELS_DIR) -r 113 | $(MAKE) test 114 | 115 | $(WHEELS_BRANCH_DIR): 116 | bzr checkout --lightweight $(WHEELS_BRANCH) $(WHEELS_BRANCH_DIR) 117 | 118 | update-wheel-branch: clean-wheels $(WHEELS_BRANCH_DIR) 119 | @$(ENV)/bin/pip install --upgrade setuptools 120 | @$(ENV)/bin/pip install --upgrade pip 121 | WHEELS_BRANCH=$(WHEELS_BRANCH) \ 122 | WHEELS_BRANCH_DIR=$(WHEELS_BRANCH_DIR) \ 123 | CONN_CHECK_REVNO=$(CONN_CHECK_REVNO) \ 124 | WHEELS_DIR=$(WHEELS_DIR) \ 125 | $(PWD)/build_scripts/update_wheels_branch.sh 126 | 127 | upload: build test pip-wheel 128 | $(ENV)/bin/python setup.py sdist bdist_wheel upload 129 | @echo 130 | @echo "Don't forget: bzr tag $(CONN_CHECK_VERSION) && bzr push" 131 | 132 | docs: TYPE=html 133 | docs: 134 | cd $(DOCS_DIR) && $(MAKE) $(TYPE) 135 | 136 | update-rtd: 137 | -curl -X POST 
http://readthedocs.org/build/conn-check 138 | 139 | 140 | .PHONY: test build pip-wheel build-wheels build-wheels-extra build-wheels-all-extras test-wheels install-debs clean cmd upload install-deb-pkg-debs build-deb-pip-cache test-build-deb docs clean-docs update-rtd check-deb-version 141 | .DEFAULT_GOAL := test 142 | -------------------------------------------------------------------------------- /conn_check/patterns.py: -------------------------------------------------------------------------------- 1 | import re 2 | 3 | 4 | class Pattern(object): 5 | """Abstract base class for patterns used to select subsets of checks.""" 6 | 7 | def assume_prefix(self, prefix): 8 | """Return an equivalent pattern with the given prefix baked in. 9 | 10 | For example, if self.matches("bar") is True, then 11 | self.assume_prefix("foo").matches("foobar") will be True. 12 | """ 13 | return PrefixPattern(prefix, self) 14 | 15 | def failed(self): 16 | """Return True if the pattern cannot match any string. 17 | 18 | This is mainly used so we can bail out early when recursing into 19 | check trees.
20 | """ 21 | return not self.prefix_matches("") 22 | 23 | def prefix_matches(self, partial_name): 24 | """Return True if the partial name (a prefix) is a potential match.""" 25 | raise NotImplementedError("{}.prefix_matches not " 26 | "implemented".format(type(self))) 27 | 28 | def matches(self, name): 29 | """Return True if the given name matches.""" 30 | raise NotImplementedError("{}.match not " 31 | "implemented".format(type(self))) 32 | 33 | 34 | class FailedPattern(Pattern): 35 | """Patterns that always fail to match.""" 36 | 37 | def assume_prefix(self, prefix): 38 | """Return an equivalent pattern with the given prefix baked in.""" 39 | return FAILED_PATTERN 40 | 41 | def prefix_matches(self, partial_name): 42 | """Return True if the partial name matches.""" 43 | return False 44 | 45 | def matches(self, name): 46 | """Return True if the complete name matches.""" 47 | return False 48 | 49 | 50 | FAILED_PATTERN = FailedPattern() 51 | 52 | 53 | PATTERN_TOKEN_RE = re.compile(r'\*|[^*]+') 54 | 55 | 56 | def tokens_to_partial_re(tokens): 57 | """Convert tokens to a regular expression for matching prefixes.""" 58 | 59 | def token_to_re(token): 60 | """Convert tokens to (begin, end, alt_end) triples.""" 61 | if token == '*': 62 | return (r'(?:.*', ')?', ')') 63 | else: 64 | chars = list(token) 65 | begin = "".join(["(?:" + re.escape(c) for c in chars]) 66 | end = "".join([")?" for c in chars]) 67 | return (begin, end, end) 68 | 69 | subexprs = map(token_to_re, tokens) 70 | if len(subexprs) > 0: 71 | # subexpressions like (.*)? 
aren't accepted, so we may have to use 72 | # an alternate closing form for the last (innermost) subexpression 73 | (begin, _, alt_end) = subexprs[-1] 74 | subexprs[-1] = (begin, alt_end, alt_end) 75 | return re.compile("".join([se[0] for se in subexprs] + 76 | [se[1] for se in reversed(subexprs)] + 77 | [r'\Z'])) 78 | 79 | 80 | def tokens_to_re(tokens): 81 | """Convert tokens to a regular expression for exact matching.""" 82 | 83 | def token_to_re(token): 84 | """Convert tokens to simple regular expressions.""" 85 | if token == '*': 86 | return r'.*' 87 | else: 88 | return re.escape(token) 89 | 90 | return re.compile("".join(map(token_to_re, tokens) + [r'\Z'])) 91 | 92 | 93 | class SimplePattern(Pattern): 94 | """Pattern that matches according to the given pattern expression.""" 95 | 96 | def __init__(self, pattern): 97 | """Initialize an instance.""" 98 | super(SimplePattern, self).__init__() 99 | tokens = PATTERN_TOKEN_RE.findall(pattern) 100 | self.partial_re = tokens_to_partial_re(tokens) 101 | self.full_re = tokens_to_re(tokens) 102 | 103 | def prefix_matches(self, partial_name): 104 | """Return True if the partial name matches.""" 105 | return self.partial_re.match(partial_name) is not None 106 | 107 | def matches(self, name): 108 | """Return True if the complete name matches.""" 109 | return self.full_re.match(name) is not None 110 | 111 | 112 | class PrefixPattern(Pattern): 113 | """Pattern that assumes a previously given prefix.""" 114 | 115 | def __init__(self, prefix, pattern): 116 | """Initialize an instance.""" 117 | super(PrefixPattern, self).__init__() 118 | self.prefix = prefix 119 | self.pattern = pattern 120 | 121 | def assume_prefix(self, prefix): 122 | """Return an equivalent pattern with the given prefix baked in.""" 123 | return PrefixPattern(self.prefix + prefix, self.pattern) 124 | 125 | def prefix_matches(self, partial_name): 126 | """Return True if the partial name matches.""" 127 | return self.pattern.prefix_matches(self.prefix + 
partial_name) 128 | 129 | def matches(self, name): 130 | """Return True if the complete name matches.""" 131 | return self.pattern.matches(self.prefix + name) 132 | 133 | 134 | class SumPattern(Pattern): 135 | """Pattern that matches if at least one given pattern matches.""" 136 | 137 | def __init__(self, patterns): 138 | """Initialize an instance.""" 139 | super(SumPattern, self).__init__() 140 | self.patterns = patterns 141 | 142 | def prefix_matches(self, partial_name): 143 | """Return True if the partial name matches.""" 144 | for pattern in self.patterns: 145 | if pattern.prefix_matches(partial_name): 146 | return True 147 | return False 148 | 149 | def matches(self, name): 150 | """Return True if the complete name matches.""" 151 | for pattern in self.patterns: 152 | if pattern.matches(name): 153 | return True 154 | return False 155 | -------------------------------------------------------------------------------- /docs/tutorial-part-1.rst: -------------------------------------------------------------------------------- 1 | Tutorial Part 1: Checking connections for a basic web app 2 | ========================================================= 3 | 4 | Hello World 5 | ----------- 6 | 7 | Suppose you have the basic webapp *HWaaS* (Hello World as a Service, naturally). 8 | 9 | It returns a different translation of "Hello World" on every request, and 10 | accepts new translations via ``POST`` requests. 11 | 12 | * The translations are stored in a *PostgreSQL* database. 13 | * *memcached* is used to keep a cache of pre-rendered "Hello World" 14 | HTML pages. 15 | * Optionally requests are sent to the 16 | `Google Translate API `_ to get an 17 | automatically translated version of the page in the user's language 18 | if they push a certain button and a translation in their language isn't 19 | available in the *PostgreSQL* DB. 
20 | * The *Squid* HTTP proxy sits between it and the Translate API to cache 21 | responses (varied by language), to avoid hitting Google's rate limiting. 22 | 23 | 24 | Why use conn-check? 25 | ------------------- 26 | 27 | Our *HWaaS* example service depends not only on three internal services, but also 28 | on a completely external service (the Google Translate API), and any number of 29 | problems, from network routing, firewall configuration and bad service 30 | configuration to external outages, could cause issues after a new deployment 31 | (or at any time really, but we'll address that later in :ref:`nagios`). 32 | 33 | *conn-check* can verify connections to these dependencies using not just basic 34 | TCP/UDP connects, but also service-specific ones, with authentication where 35 | needed, timeouts, and even permissions (e.g. can *user A* access 36 | *DB schema B*). 37 | 38 | Yet another YAML file 39 | --------------------- 40 | 41 | conn-check is configured using a `YAML `_ file containing 42 | a list of checks to perform in parallel (by default; this too is 43 | configurable with a CLI option). 44 | 45 | Here's an example file (it could be called ``hwaas-cc.yaml``): 46 | 47 | .. code-block:: yaml 48 | 49 | - type: postgresql 50 | host: gibson.hwaas.internal 51 | port: 5432 52 | username: hwaas 53 | password: 123456asdf 54 | database: hwaas_production 55 | - type: memcached 56 | host: freeside.hwaas.internal 57 | port: 11211 58 | - type: http 59 | url: https://www.googleapis.com/language/translate/v2?q=Hello%20World&target=de&source=en&key=BLAH 60 | proxy_host: countzero.hwaas.internal 61 | proxy_port: 8080 62 | expected_code: 200 63 | 64 | Let's examine those checks 65 | ---------------------------- 66 | 67 | PostgreSQL 68 | `````````` 69 | 70 | ..
code-block:: yaml 71 | 72 | - type: postgresql 73 | host: gibson.hwaas.internal 74 | port: 5432 75 | username: hwaas 76 | password: 123456asdf 77 | database: hwaas_production 78 | 79 | *type*: This one doesn't require much explanation, except that you 80 | can use either ``postgresql`` or ``postgres`` (many checks have aliases), :doc:`see the readme `. 81 | 82 | *host*, *port*: The host to connect to is always, understandably, required; 83 | if *port* is not supplied the default PostgreSQL port of ``5432`` will be used. 84 | 85 | *username*, *password*: Auth details are required, and important when used with… 86 | 87 | …*database*: This is the PostgreSQL database to connect to, which 88 | *username* must have permission to access. 89 | 90 | memcached 91 | ````````` 92 | 93 | .. code-block:: yaml 94 | 95 | - type: memcached 96 | host: freeside.hwaas.internal 97 | port: 11211 98 | 99 | *type*: ``memcache`` or ``memcached`` are valid, :doc:`see the readme `. 100 | 101 | *host*, *port*: If *port* isn't supplied the memcached default of ``11211`` is used 102 | instead. 103 | 104 | HTTP 105 | ```` 106 | 107 | .. code-block:: yaml 108 | 109 | - type: http 110 | url: https://www.googleapis.com/language/translate/v2?q=Hello%20World&target=de&source=en&key=BLAH 111 | proxy_host: countzero.hwaas.internal 112 | proxy_port: 8080 113 | expected_code: 200 114 | 115 | *type*: ``http`` or ``https`` are valid, :doc:`see the readme `. 116 | 117 | *url*: As we're doing a simple GET to the Translate API I've included the 118 | ``key`` in the querystring, but you could also include auth details as HTTP 119 | headers using the ``headers`` check option. 120 | 121 | *proxy_host*, *proxy_port*: We supply the host/port of our Squid proxy here; 122 | we could also use the ``proxy_url`` check option instead to define the proxy 123 | as a standard HTTP URL (which makes it possible to define an HTTPS proxy).
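
As a sketch of that ``proxy_url`` alternative (reusing the hypothetical Squid host from the example above; the exact URL value is an assumption, not taken from the original config):

.. code-block:: yaml

    - type: http
      url: https://www.googleapis.com/language/translate/v2?q=Hello%20World&target=de&source=en&key=BLAH
      proxy_url: http://countzero.hwaas.internal:8080
      expected_code: 200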
124 | 125 | *expected_code*: This is the `status code `_ 126 | we expect to get back from the service if the request was successful; anything 127 | other than ``200`` in this case will cause the check to fail. 128 | 129 | .. _nagios: 130 | 131 | Using conn-check with Nagios 132 | ---------------------------- 133 | 134 | conn-check output tries to stay as close as possible to the 135 | `Nagios plugin guidelines `_ 136 | so that it can be used as a regular `Nagios `_ check 137 | for continuous monitoring of your service deployment (not just ad-hoc checks at 138 | deploy time). 139 | 140 | Example NRPE config files, assuming ``conn-check`` is installed system-wide:: 141 | 142 | # /etc/nagios/nrpe.d/check_conn_check.cfg 143 | command[conn_check]=/usr/bin/conn-check --max-timeout=10 --exclude-tags=no-nagios /var/conn-check/hwaas-cc.yaml 144 | 145 | 146 | # /var/lib/nagios/export/service__hwaas_conn_check.cfg 147 | define service { 148 | use active-service 149 | host_name hwaas-web1.internal 150 | service_description connection checks with conn-check 151 | check_command check_nrpe!conn_check 152 | servicegroups web,hwaas 153 | } 154 | 155 | A few arguments to note: 156 | 157 | ``--max-timeout=10``: This sets the global timeout to 10 seconds: conn-check 158 | will error if the total time for all checks combined exceeds 10s, which is 159 | the default maximum time Nagios allows a plugin to run. 160 | 161 | This way we still get all the individual check results back even if one of them 162 | went above the threshold. 163 | 164 | 165 | ``--exclude-tags=no-nagios``: Although optional, this allows you to exclude 166 | any check tagged with ``no-nagios``, which is especially handy for checks of 167 | external/third-party services that you don't want to be hit constantly 168 | by Nagios. 169 | 170 | For example, if we didn't want Nagios to hit Google every few minutes: 171 | 172 | ..
code-block:: yaml 173 | 174 | - type: http 175 | url: https://www.googleapis.com/language/translate/v2?q=Hello%20World&target=de&source=en&key=BLAH 176 | proxy_host: countzero.hwaas.internal 177 | proxy_port: 8080 178 | expected_code: 200 179 | tags: [no-nagios] 180 | -------------------------------------------------------------------------------- /docs/Makefile: -------------------------------------------------------------------------------- 1 | # Makefile for Sphinx documentation 2 | # 3 | 4 | # You can set these variables from the command line. 5 | SPHINXOPTS = 6 | SPHINXBUILD = sphinx-build 7 | PAPER = 8 | BUILDDIR = _build 9 | 10 | # User-friendly check for sphinx-build 11 | ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) 12 | $(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) 13 | endif 14 | 15 | # Internal variables. 16 | PAPEROPT_a4 = -D latex_paper_size=a4 17 | PAPEROPT_letter = -D latex_paper_size=letter 18 | ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 19 | # the i18n builder cannot share the environment and doctrees with the others 20 | I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 
21 | 22 | .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext 23 | 24 | help: 25 | @echo "Please use \`make ' where is one of" 26 | @echo " html to make standalone HTML files" 27 | @echo " dirhtml to make HTML files named index.html in directories" 28 | @echo " singlehtml to make a single large HTML file" 29 | @echo " pickle to make pickle files" 30 | @echo " json to make JSON files" 31 | @echo " htmlhelp to make HTML files and a HTML help project" 32 | @echo " qthelp to make HTML files and a qthelp project" 33 | @echo " devhelp to make HTML files and a Devhelp project" 34 | @echo " epub to make an epub" 35 | @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" 36 | @echo " latexpdf to make LaTeX files and run them through pdflatex" 37 | @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx" 38 | @echo " text to make text files" 39 | @echo " man to make manual pages" 40 | @echo " texinfo to make Texinfo files" 41 | @echo " info to make Texinfo files and run them through makeinfo" 42 | @echo " gettext to make PO message catalogs" 43 | @echo " changes to make an overview of all changed/added/deprecated items" 44 | @echo " xml to make Docutils-native XML files" 45 | @echo " pseudoxml to make pseudoxml-XML files for display purposes" 46 | @echo " linkcheck to check all external links for integrity" 47 | @echo " doctest to run all doctests embedded in the documentation (if enabled)" 48 | 49 | clean: 50 | rm -rf $(BUILDDIR)/* 51 | 52 | html: 53 | $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html 54 | @echo 55 | @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." 56 | 57 | dirhtml: 58 | $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml 59 | @echo 60 | @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." 
61 | 62 | singlehtml: 63 | $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml 64 | @echo 65 | @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." 66 | 67 | pickle: 68 | $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle 69 | @echo 70 | @echo "Build finished; now you can process the pickle files." 71 | 72 | json: 73 | $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json 74 | @echo 75 | @echo "Build finished; now you can process the JSON files." 76 | 77 | htmlhelp: 78 | $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp 79 | @echo 80 | @echo "Build finished; now you can run HTML Help Workshop with the" \ 81 | ".hhp project file in $(BUILDDIR)/htmlhelp." 82 | 83 | qthelp: 84 | $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp 85 | @echo 86 | @echo "Build finished; now you can run "qcollectiongenerator" with the" \ 87 | ".qhcp project file in $(BUILDDIR)/qthelp, like this:" 88 | @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/ReadtheDocsTemplate.qhcp" 89 | @echo "To view the help file:" 90 | @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/ReadtheDocsTemplate.qhc" 91 | 92 | devhelp: 93 | $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp 94 | @echo 95 | @echo "Build finished." 96 | @echo "To view the help file:" 97 | @echo "# mkdir -p $$HOME/.local/share/devhelp/ReadtheDocsTemplate" 98 | @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/ReadtheDocsTemplate" 99 | @echo "# devhelp" 100 | 101 | epub: 102 | $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub 103 | @echo 104 | @echo "Build finished. The epub file is in $(BUILDDIR)/epub." 105 | 106 | latex: 107 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex 108 | @echo 109 | @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." 110 | @echo "Run \`make' in that directory to run these through (pdf)latex" \ 111 | "(use \`make latexpdf' here to do that automatically)." 
112 | 113 | latexpdf: 114 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex 115 | @echo "Running LaTeX files through pdflatex..." 116 | $(MAKE) -C $(BUILDDIR)/latex all-pdf 117 | @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." 118 | 119 | latexpdfja: 120 | $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex 121 | @echo "Running LaTeX files through platex and dvipdfmx..." 122 | $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja 123 | @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." 124 | 125 | text: 126 | $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text 127 | @echo 128 | @echo "Build finished. The text files are in $(BUILDDIR)/text." 129 | 130 | man: 131 | $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man 132 | @echo 133 | @echo "Build finished. The manual pages are in $(BUILDDIR)/man." 134 | 135 | texinfo: 136 | $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo 137 | @echo 138 | @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." 139 | @echo "Run \`make' in that directory to run these through makeinfo" \ 140 | "(use \`make info' here to do that automatically)." 141 | 142 | info: 143 | $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo 144 | @echo "Running Texinfo files through makeinfo..." 145 | make -C $(BUILDDIR)/texinfo info 146 | @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." 147 | 148 | gettext: 149 | $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale 150 | @echo 151 | @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." 152 | 153 | changes: 154 | $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes 155 | @echo 156 | @echo "The overview file is in $(BUILDDIR)/changes." 157 | 158 | linkcheck: 159 | $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck 160 | @echo 161 | @echo "Link check complete; look for any errors in the above output " \ 162 | "or in $(BUILDDIR)/linkcheck/output.txt." 
163 | 164 | doctest: 165 | $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest 166 | @echo "Testing of doctests in the sources finished, look at the " \ 167 | "results in $(BUILDDIR)/doctest/output.txt." 168 | 169 | xml: 170 | $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml 171 | @echo 172 | @echo "Build finished. The XML files are in $(BUILDDIR)/xml." 173 | 174 | pseudoxml: 175 | $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml 176 | @echo 177 | @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." 178 | -------------------------------------------------------------------------------- /docs/tutorial-part-2.rst: -------------------------------------------------------------------------------- 1 | Tutorial Part 2: Auto-generating conn-check config for a Django app 2 | =================================================================== 3 | 4 | Hello World (again) 5 | ------------------- 6 | 7 | Let's assume that you've actually created the ``Hello World`` service from 8 | :doc:`part 1 ` as a 9 | `Django app `_, and you think to yourself: 10 | 11 | *"Hang on, aren't all these connections I want conn-check to check for me 12 | already defined in my Django settings module?"* 13 | 14 | conn-check-configs 15 | ------------------ 16 | 17 | Yes, yes they are, and with the handy-dandy 18 | `conn-check-configs `_ 19 | package you can automatically generate conn-check config YAML from a range of 20 | standard Django settings values (in theory from other environments 21 | too, such as `Juju `_, but for now just Django). 22 | 23 | exempli gratia 24 | -------------- 25 | 26 | Given the following ``settings.py`` in our *HWaaS* app: 27 | 28 | .. 
code-block:: python 29 | 30 | INSTALLED_APPS = [ 31 | 'hwaas' 32 | ] 33 | DATABASES = { 34 | 'default': { 35 | 'ENGINE': 'django.db.backends.postgresql_psycopg2', 36 | 'HOST': 'gibson.hwaas.internal', 37 | 'NAME': 'hwaas_production', 38 | 'PASSWORD': '123456asdf', 39 | 'PORT': 5432, 40 | 'USER': 'hwaas', 41 | }, 42 | } 43 | CACHES = { 44 | 'default': { 45 | 'LOCATION': 'freeside.hwaas.internal:11211', 46 | 'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache', 47 | }, 48 | } 49 | PROXY_HOST = 'countzero.hwaas.internal' 50 | PROXY_PORT = 8080 51 | TRANSLATE_API_KEY = 'BLAH' 52 | 53 | We can create a ``settings-to-conn-check.py`` script with the least possible 54 | effort like so: 55 | 56 | .. code-block:: python 57 | 58 | #!/usr/bin/env python 59 | from conn_check_configs.django import run 60 | 61 | if __name__ == '__main__': 62 | run() 63 | 64 | This will output *postgresql* and *memcached* checks similar to our 65 | hand-written config: 66 | 67 | .. code-block:: sh 68 | 69 | $ chmod +x settings-to-conn-check.py 70 | $ ./settings-to-conn-check.py -f cc-config.yaml -m hwaas.settings 71 | $ cat cc-config.yaml 72 | 73 | .. code-block:: yaml 74 | 75 | - type: postgresql 76 | database: hwaas_production 77 | host: gibson.hwaas.internal 78 | port: 5432 79 | username: hwaas 80 | password: 123456asdf 81 | - type: memcached 82 | host: freeside.hwaas.internal 83 | port: 11211 84 | 85 | Customising generated checks 86 | ---------------------------- 87 | 88 | In order to generate the checks we need for the Squid / Google Translate API, 89 | we can add some custom callbacks: 90 | 91 | ..
code-block:: python 91 | 92 | #!/usr/bin/env python 93 | from conn_check_configs.django import run, EXTRA_CHECK_MAKERS 94 | 95 | 96 | def make_proxied_translate_check(settings, options): 97 | checks = [] 98 | if settings['PROXY_HOST']: 99 | checks.append({ 100 | 'type': 'http', 101 | 'url': 'https://www.googleapis.com/language/translate/v2?q=' 102 | 'Hello%20World&target=de&source=en&key={}'.format( 103 | settings['TRANSLATE_API_KEY']), 104 | 'proxy_host': settings['PROXY_HOST'], 105 | 'proxy_port': int(settings.get('PROXY_PORT', 8080)), 106 | 'expected_code': 200, 107 | }) 108 | return checks 109 | 110 | EXTRA_CHECK_MAKERS.append(make_proxied_translate_check) 111 | 112 | 113 | if __name__ == '__main__': 114 | run() 115 | 116 | 117 | In the above we define a callable which takes two parameters: ``settings``, which 118 | is a wrapper around the Django settings module, and ``options``, which is 119 | an object containing the command-line arguments that were passed to the script. 120 | 121 | The ``settings`` object is not the settings module itself but a dict-like 122 | wrapper, so you can access the settings just like a dict (using indices, 123 | the ``.get`` method, etc.). 124 | 125 | To ensure ``make_proxied_translate_check`` is collected and called by the main 126 | ``run`` function we add it to the ``EXTRA_CHECK_MAKERS`` list. 127 | 128 | The above generates our required HTTP check: 129 | 130 | .. code-block:: yaml 131 | 132 | - type: http 133 | url: https://www.googleapis.com/language/translate/v2?q=Hello%20World&target=de&source=en&key=BLAH 134 | proxy_host: countzero.hwaas.internal 135 | proxy_port: 8080 136 | expected_code: 200 137 | 138 | A note on statsd checks 139 | ------------------------ 140 | 141 | Getting more operational visibility on how *HWaaS* runs would be great, 142 | wouldn't it?
143 | 144 | So let's add some metrics collection using 145 | `StatsD `_, and as luck would have it we can 146 | get a lot for *nearly free* with the 147 | `django-statsd `_ package. After adding it to 148 | our dependencies we update our ``settings.py`` to include: 149 | 150 | .. code-block:: python 151 | 152 | INSTALLED_APPS = [ 153 | 'hwaas', 154 | 'django_statsd', 155 | ] 156 | MIDDLEWARE_CLASSES = [ 157 | 'django_statsd.middleware.GraphiteMiddleware', 158 | ] 159 | STATSD_CLIENT = 'django_statsd.clients.normal' 160 | STATSD_HOST = 'bigend.hwaas.internal' 161 | STATSD_PORT = 10021 162 | 163 | **Note**: You don't actually need the django-statsd app for 164 | conn-check-configs to generate statsd checks; only the use of ``STATSD_HOST`` 165 | and ``STATSD_PORT`` in your settings module matters. 166 | 167 | Another run of our ``settings-to-conn-check.py`` script will result in the 168 | extra statsd check: 169 | 170 | .. code-block:: yaml 171 | 172 | - type: udp 173 | host: bigend.hwaas.internal 174 | port: 10021 175 | send: conncheck.test:1|c 176 | expect: 177 | 178 | As you can see this is just a generic UDP check that attempts to send an 179 | incremental counter metric to the statsd host. 180 | 181 | Unfortunately the fire-and-forget nature of this use of statsd/UDP means it will not 182 | error in a number of common situations (the simplest being that statsd is not 183 | running on the target host, or even a routing issue along the way). 184 | 185 | It will catch simple problems such as not being able to open up the local UDP 186 | port to send from, but that's usually not enough. 187 | 188 | If you use a third-party implementation of statsd, such as 189 | `txStatsD `_, then you might have the ability 190 | to define a pair of health-check strings; for example, by changing the send 191 | and expect values in the ``STATSD_CHECK`` dict we can send and expect different 192 | strings: 193 | 194 | ..
code-block:: python 195 | 196 | #!/usr/bin/env python 197 | from conn_check_configs.django import run, STATSD_CHECK 198 | 199 | STATSD_CHECK['send'] = 'Hakuna' 200 | STATSD_CHECK['expect'] = 'Matata' 201 | 202 | if __name__ == '__main__': 203 | run() 204 | 205 | Which generates this check: 206 | 207 | .. code-block:: yaml 208 | 209 | - type: udp 210 | host: bigend.hwaas.internal 211 | port: 10021 212 | send: Hakuna 213 | expect: Matata 214 | 215 | In the above we would configure our txStatsD (for example) instance to respond 216 | to the string ``Hakuna`` with the string ``Matata``, which would catch pretty 217 | much all possible issues with contacting our metrics service. 218 | -------------------------------------------------------------------------------- /docs/make.bat: -------------------------------------------------------------------------------- 1 | @ECHO OFF 2 | 3 | REM Command file for Sphinx documentation 4 | 5 | if "%SPHINXBUILD%" == "" ( 6 | set SPHINXBUILD=sphinx-build 7 | ) 8 | set BUILDDIR=_build 9 | set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . 10 | set I18NSPHINXOPTS=%SPHINXOPTS% . 11 | if NOT "%PAPER%" == "" ( 12 | set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% 13 | set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% 14 | ) 15 | 16 | if "%1" == "" goto help 17 | 18 | if "%1" == "help" ( 19 | :help 20 | echo.Please use `make ^` where ^ is one of 21 | echo. html to make standalone HTML files 22 | echo. dirhtml to make HTML files named index.html in directories 23 | echo. singlehtml to make a single large HTML file 24 | echo. pickle to make pickle files 25 | echo. json to make JSON files 26 | echo. htmlhelp to make HTML files and a HTML help project 27 | echo. qthelp to make HTML files and a qthelp project 28 | echo. devhelp to make HTML files and a Devhelp project 29 | echo. epub to make an epub 30 | echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter 31 | echo.
text to make text files 32 | echo. man to make manual pages 33 | echo. texinfo to make Texinfo files 34 | echo. gettext to make PO message catalogs 35 | echo. changes to make an overview over all changed/added/deprecated items 36 | echo. xml to make Docutils-native XML files 37 | echo. pseudoxml to make pseudoxml-XML files for display purposes 38 | echo. linkcheck to check all external links for integrity 39 | echo. doctest to run all doctests embedded in the documentation if enabled 40 | goto end 41 | ) 42 | 43 | if "%1" == "clean" ( 44 | for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i 45 | del /q /s %BUILDDIR%\* 46 | goto end 47 | ) 48 | 49 | 50 | %SPHINXBUILD% 2> nul 51 | if errorlevel 9009 ( 52 | echo. 53 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx 54 | echo.installed, then set the SPHINXBUILD environment variable to point 55 | echo.to the full path of the 'sphinx-build' executable. Alternatively you 56 | echo.may add the Sphinx directory to PATH. 57 | echo. 58 | echo.If you don't have Sphinx installed, grab it from 59 | echo.http://sphinx-doc.org/ 60 | exit /b 1 61 | ) 62 | 63 | if "%1" == "html" ( 64 | %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html 65 | if errorlevel 1 exit /b 1 66 | echo. 67 | echo.Build finished. The HTML pages are in %BUILDDIR%/html. 68 | goto end 69 | ) 70 | 71 | if "%1" == "dirhtml" ( 72 | %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml 73 | if errorlevel 1 exit /b 1 74 | echo. 75 | echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. 76 | goto end 77 | ) 78 | 79 | if "%1" == "singlehtml" ( 80 | %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml 81 | if errorlevel 1 exit /b 1 82 | echo. 83 | echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. 84 | goto end 85 | ) 86 | 87 | if "%1" == "pickle" ( 88 | %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle 89 | if errorlevel 1 exit /b 1 90 | echo. 
91 | echo.Build finished; now you can process the pickle files. 92 | goto end 93 | ) 94 | 95 | if "%1" == "json" ( 96 | %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json 97 | if errorlevel 1 exit /b 1 98 | echo. 99 | echo.Build finished; now you can process the JSON files. 100 | goto end 101 | ) 102 | 103 | if "%1" == "htmlhelp" ( 104 | %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp 105 | if errorlevel 1 exit /b 1 106 | echo. 107 | echo.Build finished; now you can run HTML Help Workshop with the ^ 108 | .hhp project file in %BUILDDIR%/htmlhelp. 109 | goto end 110 | ) 111 | 112 | if "%1" == "qthelp" ( 113 | %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp 114 | if errorlevel 1 exit /b 1 115 | echo. 116 | echo.Build finished; now you can run "qcollectiongenerator" with the ^ 117 | .qhcp project file in %BUILDDIR%/qthelp, like this: 118 | echo.^> qcollectiongenerator %BUILDDIR%\qthelp\complexity.qhcp 119 | echo.To view the help file: 120 | echo.^> assistant -collectionFile %BUILDDIR%\qthelp\complexity.ghc 121 | goto end 122 | ) 123 | 124 | if "%1" == "devhelp" ( 125 | %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp 126 | if errorlevel 1 exit /b 1 127 | echo. 128 | echo.Build finished. 129 | goto end 130 | ) 131 | 132 | if "%1" == "epub" ( 133 | %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub 134 | if errorlevel 1 exit /b 1 135 | echo. 136 | echo.Build finished. The epub file is in %BUILDDIR%/epub. 137 | goto end 138 | ) 139 | 140 | if "%1" == "latex" ( 141 | %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex 142 | if errorlevel 1 exit /b 1 143 | echo. 144 | echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. 145 | goto end 146 | ) 147 | 148 | if "%1" == "latexpdf" ( 149 | %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex 150 | cd %BUILDDIR%/latex 151 | make all-pdf 152 | cd %BUILDDIR%/.. 153 | echo. 154 | echo.Build finished; the PDF files are in %BUILDDIR%/latex. 
155 | goto end 156 | ) 157 | 158 | if "%1" == "latexpdfja" ( 159 | %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex 160 | cd %BUILDDIR%/latex 161 | make all-pdf-ja 162 | cd %BUILDDIR%/.. 163 | echo. 164 | echo.Build finished; the PDF files are in %BUILDDIR%/latex. 165 | goto end 166 | ) 167 | 168 | if "%1" == "text" ( 169 | %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text 170 | if errorlevel 1 exit /b 1 171 | echo. 172 | echo.Build finished. The text files are in %BUILDDIR%/text. 173 | goto end 174 | ) 175 | 176 | if "%1" == "man" ( 177 | %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man 178 | if errorlevel 1 exit /b 1 179 | echo. 180 | echo.Build finished. The manual pages are in %BUILDDIR%/man. 181 | goto end 182 | ) 183 | 184 | if "%1" == "texinfo" ( 185 | %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo 186 | if errorlevel 1 exit /b 1 187 | echo. 188 | echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. 189 | goto end 190 | ) 191 | 192 | if "%1" == "gettext" ( 193 | %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale 194 | if errorlevel 1 exit /b 1 195 | echo. 196 | echo.Build finished. The message catalogs are in %BUILDDIR%/locale. 197 | goto end 198 | ) 199 | 200 | if "%1" == "changes" ( 201 | %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes 202 | if errorlevel 1 exit /b 1 203 | echo. 204 | echo.The overview file is in %BUILDDIR%/changes. 205 | goto end 206 | ) 207 | 208 | if "%1" == "linkcheck" ( 209 | %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck 210 | if errorlevel 1 exit /b 1 211 | echo. 212 | echo.Link check complete; look for any errors in the above output ^ 213 | or in %BUILDDIR%/linkcheck/output.txt. 214 | goto end 215 | ) 216 | 217 | if "%1" == "doctest" ( 218 | %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest 219 | if errorlevel 1 exit /b 1 220 | echo. 
221 | echo.Testing of doctests in the sources finished, look at the ^ 222 | results in %BUILDDIR%/doctest/output.txt. 223 | goto end 224 | ) 225 | 226 | if "%1" == "xml" ( 227 | %SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml 228 | if errorlevel 1 exit /b 1 229 | echo. 230 | echo.Build finished. The XML files are in %BUILDDIR%/xml. 231 | goto end 232 | ) 233 | 234 | if "%1" == "pseudoxml" ( 235 | %SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml 236 | if errorlevel 1 exit /b 1 237 | echo. 238 | echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml. 239 | goto end 240 | ) 241 | 242 | :end 243 | -------------------------------------------------------------------------------- /docs/conf.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # 3 | # Read the Docs Template documentation build configuration file, created by 4 | # sphinx-quickstart on Tue Aug 26 14:19:49 2014. 5 | # 6 | # This file is execfile()d with the current directory set to its 7 | # containing dir. 8 | # 9 | # Note that not all possible configuration values are present in this 10 | # autogenerated file. 11 | # 12 | # All configuration values have a default; values that are commented out 13 | # serve to show the default. 14 | 15 | import sys 16 | import os 17 | 18 | # If extensions (or modules to document with autodoc) are in another directory, 19 | # add these directories to sys.path here. If the directory is relative to the 20 | # documentation root, use os.path.abspath to make it absolute, like shown here. 21 | #sys.path.insert(0, os.path.abspath('.')) 22 | 23 | # -- General configuration ------------------------------------------------ 24 | 25 | # If your documentation needs a minimal Sphinx version, state it here. 26 | #needs_sphinx = '1.0' 27 | 28 | # Add any Sphinx extension module names here, as strings. 
They can be 29 | # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom 30 | # ones. 31 | extensions = [] 32 | 33 | # Add any paths that contain templates here, relative to this directory. 34 | templates_path = ['_templates'] 35 | 36 | # The suffix of source filenames. 37 | source_suffix = '.rst' 38 | 39 | # The encoding of source files. 40 | #source_encoding = 'utf-8-sig' 41 | 42 | # The master toctree document. 43 | master_doc = 'index' 44 | 45 | # General information about the project. 46 | project = u'conn-check' 47 | copyright = u'2015, Canonical Ltd.' 48 | 49 | # The version info for the project you're documenting, acts as replacement for 50 | # |version| and |release|, also used in various other places throughout the 51 | # built documents. 52 | # 53 | # The short X.Y version. 54 | version = '1.1' 55 | # The full version, including alpha/beta/rc tags. 56 | release = '1.1.0' 57 | 58 | # The language for content autogenerated by Sphinx. Refer to documentation 59 | # for a list of supported languages. 60 | #language = None 61 | 62 | # There are two options for replacing |today|: either, you set today to some 63 | # non-false value, then it is used: 64 | #today = '' 65 | # Else, today_fmt is used as the format for a strftime call. 66 | #today_fmt = '%B %d, %Y' 67 | 68 | # List of patterns, relative to source directory, that match files and 69 | # directories to ignore when looking for source files. 70 | exclude_patterns = ['_build'] 71 | 72 | # The reST default role (used for this markup: `text`) to use for all 73 | # documents. 74 | #default_role = None 75 | 76 | # If true, '()' will be appended to :func: etc. cross-reference text. 77 | #add_function_parentheses = True 78 | 79 | # If true, the current module name will be prepended to all description 80 | # unit titles (such as .. function::). 81 | #add_module_names = True 82 | 83 | # If true, sectionauthor and moduleauthor directives will be shown in the 84 | # output. 
They are ignored by default. 85 | #show_authors = False 86 | 87 | # The name of the Pygments (syntax highlighting) style to use. 88 | pygments_style = 'sphinx' 89 | 90 | # A list of ignored prefixes for module index sorting. 91 | #modindex_common_prefix = [] 92 | 93 | # If true, keep warnings as "system message" paragraphs in the built documents. 94 | #keep_warnings = False 95 | 96 | 97 | # -- Options for HTML output ---------------------------------------------- 98 | 99 | # The theme to use for HTML and HTML Help pages. See the documentation for 100 | # a list of builtin themes. 101 | html_theme = 'default' 102 | 103 | # Theme options are theme-specific and customize the look and feel of a theme 104 | # further. For a list of options available for each theme, see the 105 | # documentation. 106 | #html_theme_options = {} 107 | 108 | # Add any paths that contain custom themes here, relative to this directory. 109 | #html_theme_path = [] 110 | 111 | # The name for this set of Sphinx documents. If None, it defaults to 112 | # " v documentation". 113 | #html_title = None 114 | 115 | # A shorter title for the navigation bar. Default is the same as html_title. 116 | #html_short_title = None 117 | 118 | # The name of an image file (relative to this directory) to place at the top 119 | # of the sidebar. 120 | #html_logo = None 121 | 122 | # The name of an image file (within the static path) to use as favicon of the 123 | # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 124 | # pixels large. 125 | #html_favicon = None 126 | 127 | # Add any paths that contain custom static files (such as style sheets) here, 128 | # relative to this directory. They are copied after the builtin static files, 129 | # so a file named "default.css" will overwrite the builtin "default.css". 130 | html_static_path = ['static'] 131 | 132 | # Add any extra paths that contain custom files (such as robots.txt or 133 | # .htaccess) here, relative to this directory. 
These files are copied 134 | # directly to the root of the documentation. 135 | #html_extra_path = [] 136 | 137 | # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, 138 | # using the given strftime format. 139 | #html_last_updated_fmt = '%b %d, %Y' 140 | 141 | # If true, SmartyPants will be used to convert quotes and dashes to 142 | # typographically correct entities. 143 | #html_use_smartypants = True 144 | 145 | # Custom sidebar templates, maps document names to template names. 146 | #html_sidebars = {} 147 | 148 | # Additional templates that should be rendered to pages, maps page names to 149 | # template names. 150 | #html_additional_pages = {} 151 | 152 | # If false, no module index is generated. 153 | #html_domain_indices = True 154 | 155 | # If false, no index is generated. 156 | #html_use_index = True 157 | 158 | # If true, the index is split into individual pages for each letter. 159 | #html_split_index = False 160 | 161 | # If true, links to the reST sources are added to the pages. 162 | #html_show_sourcelink = True 163 | 164 | # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. 165 | #html_show_sphinx = True 166 | 167 | # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. 168 | #html_show_copyright = True 169 | 170 | # If true, an OpenSearch description file will be output, and all pages will 171 | # contain a tag referring to it. The value of this option must be the 172 | # base URL from which the finished HTML is served. 173 | #html_use_opensearch = '' 174 | 175 | # This is the file name suffix for HTML files (e.g. ".xhtml"). 176 | #html_file_suffix = None 177 | 178 | # Output file base name for HTML help builder. 179 | htmlhelp_basename = 'ConnCheckdoc' 180 | 181 | 182 | # -- Options for LaTeX output --------------------------------------------- 183 | 184 | latex_elements = { 185 | # The paper size ('letterpaper' or 'a4paper'). 
186 | #'papersize': 'letterpaper', 187 | 188 | # The font size ('10pt', '11pt' or '12pt'). 189 | #'pointsize': '10pt', 190 | 191 | # Additional stuff for the LaTeX preamble. 192 | #'preamble': '', 193 | } 194 | 195 | # Grouping the document tree into LaTeX files. List of tuples 196 | # (source start file, target name, title, 197 | # author, documentclass [howto, manual, or own class]). 198 | latex_documents = [ 199 | ('index', 'ConnCheck.tex', u'conn-check Documentation', 200 | u'Canonical', 'manual'), 201 | ] 202 | 203 | # The name of an image file (relative to this directory) to place at the top of 204 | # the title page. 205 | #latex_logo = None 206 | 207 | # For "manual" documents, if this is true, then toplevel headings are parts, 208 | # not chapters. 209 | #latex_use_parts = False 210 | 211 | # If true, show page references after internal links. 212 | #latex_show_pagerefs = False 213 | 214 | # If true, show URL addresses after external links. 215 | #latex_show_urls = False 216 | 217 | # Documents to append as an appendix to all manuals. 218 | #latex_appendices = [] 219 | 220 | # If false, no module index is generated. 221 | #latex_domain_indices = True 222 | 223 | 224 | # -- Options for manual page output --------------------------------------- 225 | 226 | # One entry per manual page. List of tuples 227 | # (source start file, name, description, authors, manual section). 228 | man_pages = [ 229 | ('index', 'conncheck', u'conn-check Documentation', 230 | [u'Canonical'], 1) 231 | ] 232 | 233 | # If true, show URL addresses after external links. 234 | #man_show_urls = False 235 | 236 | 237 | # -- Options for Texinfo output ------------------------------------------- 238 | 239 | # Grouping the document tree into Texinfo files. 
List of tuples 240 | # (source start file, target name, title, author, 241 | # dir menu entry, description, category) 242 | texinfo_documents = [ 243 | ('index', 'ConnCheck', u'conn-check Documentation', 244 | u'Canonical', 'ConnCheck', 'Network connection checking utility.', 245 | 'Miscellaneous'), 246 | ] 247 | 248 | # Documents to append as an appendix to all manuals. 249 | #texinfo_appendices = [] 250 | 251 | # If false, no module index is generated. 252 | #texinfo_domain_indices = True 253 | 254 | # How to display URL addresses: 'footnote', 'no', or 'inline'. 255 | #texinfo_show_urls = 'footnote' 256 | 257 | # If true, do not generate a @detailmenu in the "Top" node's menu. 258 | #texinfo_no_detailmenu = False 259 | -------------------------------------------------------------------------------- /README.rst: -------------------------------------------------------------------------------- 1 | About conn-check 2 | ================ 3 | 4 | conn-check allows for checking connectivity with external services. 5 | 6 | You can write a config file that defines the services you need 7 | access to, and conn-check will check connectivity with each. 8 | 9 | It supports various types of services, all of which allow for 10 | basic network checks, and some also allow for confirming that 11 | credentials work. 12 | 13 | Configuration 14 | ------------- 15 | 16 | The configuration is done via a YAML file. The file defines a list 17 | of checks to perform: 18 | 19 | .. code-block:: yaml 20 | 21 | - type: tcp 22 | host: localhost 23 | port: 80 24 | - type: tls 25 | host: localhost 26 | port: 443 27 | disable_tls_verification: false 28 | 29 | Each check defines a type, and then options as appropriate for that type. 30 | 31 | For a step-by-step guide on configuring conn-check for your application 32 | `see the tutorial `_. 33 | 34 | Check Types 35 | ----------- 36 | 37 | tcp 38 | ``` 39 | 40 | A simple TCP connectivity check. 41 | 42 | host 43 | The host. 
44 | 45 | port 46 | The port. 47 | 48 | timeout 49 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 50 | 51 | 52 | tls 53 | ``` 54 | 55 | A check that uses TLS (`ssl` is a deprecated alias for this type). 56 | 57 | host 58 | The host. 59 | 60 | port 61 | The port. 62 | 63 | disable_tls_verification 64 | Optional flag to disable verification of TLS certs and handshake. Default: 65 | false. 66 | 67 | timeout 68 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 69 | 70 | 71 | udp 72 | ``` 73 | 74 | Check that sending a specific UDP packet gets a specific response. 75 | 76 | host 77 | The host. 78 | 79 | port 80 | The port. 81 | 82 | send 83 | The string to send. 84 | 85 | expect 86 | The string to expect in the response. 87 | 88 | timeout 89 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 90 | 91 | 92 | http 93 | ```` 94 | 95 | Check that an HTTP/HTTPS request succeeds (`https` also works). 96 | 97 | url 98 | The URL to fetch. 99 | 100 | method 101 | Optional HTTP method to use. Default: "GET". 102 | 103 | expected_code 104 | Optional status code that defines success. Default: 200. 105 | 106 | proxy_url 107 | Optional HTTP/HTTPS proxy URL to connect via, including protocol; 108 | if set, proxy_{host,port} are ignored. 109 | 110 | proxy_host 111 | Optional HTTP/HTTPS proxy to connect via. 112 | 113 | proxy_port 114 | Optional port to use with ``proxy_host``. Default: 8000. 115 | 116 | headers 117 | Optional headers to send, as a dict of key-values. Multiple values can be 118 | given as a list/tuple of lists/tuples, e.g.: 119 | ``[('foo', 'bar'), ('foo', 'baz')]`` 120 | 121 | body 122 | Optional raw request body string to send. 123 | 124 | disable_tls_verification 125 | Optional flag to disable verification of TLS certs and handshake. Default: 126 | false. 127 | 128 | timeout 129 | Optional connection timeout in seconds. 
Default: 10 (or value from ``--connect-timeout``). 130 | 131 | allow_redirects 132 | Optional flag to follow 30x redirects. Default: false. 133 | 134 | params 135 | Optional dict of params to URL encode and pass in the querystring. 136 | 137 | cookies 138 | Optional dict of cookies to pass in the request headers. 139 | 140 | auth 141 | Optional `basic HTTP auth `_ 142 | credentials, as a tuple/list: ``(username, password)``. 143 | 144 | digest_auth 145 | Optional `digest HTTP auth `_ 146 | credentials, as a tuple/list: ``(username, password)``. 147 | 148 | 149 | amqp 150 | ```` 151 | 152 | Check that an AMQP server can be authenticated against. 153 | 154 | host 155 | The host. 156 | 157 | port 158 | The port. 159 | 160 | username 161 | The username to authenticate with. 162 | 163 | password 164 | The password to authenticate with. 165 | 166 | use_tls 167 | Optional flag for whether to connect with TLS. Default: true. 168 | 169 | vhost 170 | Optional vhost name to connect to. Default: '/'. 171 | 172 | timeout 173 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 174 | 175 | 176 | postgres 177 | ```````` 178 | 179 | Check that a PostgreSQL db can be authenticated against (`postgresql` also works). 180 | 181 | host 182 | The host. 183 | 184 | port 185 | The port. 186 | 187 | username 188 | The username to authenticate with. 189 | 190 | password 191 | The password to authenticate with. 192 | 193 | database 194 | The database to connect to. 195 | 196 | timeout 197 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 198 | 199 | 200 | redis 201 | ````` 202 | 203 | Check that a redis server is present, optionally checking authentication. 204 | 205 | host 206 | The host. 207 | 208 | port 209 | The port. 210 | 211 | password 212 | Optional password to authenticate with. 213 | 214 | timeout 215 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 
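
To make the fields above concrete, here's an illustrative redis check entry for the YAML config file; the host and password values here are made up:

.. code-block:: yaml

    - type: redis
      host: cache.internal
      port: 6379
      password: sekrit

The ``password`` field can simply be omitted for an unauthenticated redis server.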
216 | 217 | 218 | memcache 219 | ```````` 220 | 221 | Check that a memcached server is present (`memcached` also works). 222 | 223 | host 224 | The host. 225 | 226 | port 227 | The port. 228 | 229 | timeout 230 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 231 | 232 | 233 | mongodb 234 | ``````` 235 | 236 | Check that a MongoDB server is present (`mongo` also works). 237 | 238 | host 239 | The host. 240 | 241 | port 242 | Optional port. Default: 27017. 243 | 244 | username 245 | Optional username to authenticate with. 246 | 247 | password 248 | Optional password to authenticate with. 249 | 250 | database 251 | Optional database name to connect to; if not set, the ``test`` database will be used; 252 | if this database does not exist (or is not available to the user) you will need to 253 | provide a database name. 254 | 255 | timeout 256 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 257 | 258 | 259 | smtp 260 | ```` 261 | 262 | Check that we can reach, authenticate with, and send an email using an SMTP server. 263 | 264 | **Note 1**: if this check succeeds an email is actually sent to the address 265 | defined in ``to_address``; be careful how this check is configured so it doesn't 266 | unintentionally spam anyone. 267 | 268 | **Note 2**: only EHLO/HELO over a TLS connection is supported with the ``use_tls`` 269 | flag; this check cannot currently create a new TLS connection using the 270 | `STARTTLS Extension `_. 271 | 272 | host 273 | The host. 274 | 275 | port 276 | The port, normally 465 for TLS and 25 for plaintext. 277 | 278 | username 279 | Username to authenticate with. 280 | 281 | password 282 | Password to authenticate with. 283 | 284 | from_address 285 | Email address to send `from`. 286 | 287 | to_address 288 | Email address to send `to`. 289 | 290 | message 291 | Optional email body. 292 | 293 | subject 294 | Optional email subject. 
295 | 296 | helo_fallback 297 | Optional flag that defines whether to fall back to ``HELO`` if the ``EHLO`` 298 | extended command set fails. 299 | 300 | use_tls 301 | Optional flag to enable TLS security on connection. Default: true. 302 | 303 | timeout 304 | Optional connection timeout in seconds. Default: 10 (or value from ``--connect-timeout``). 305 | 306 | 307 | Tags 308 | ---- 309 | 310 | Every check type also supports a ``tags`` field, which is a list of tags that 311 | can be used with the ``--include-tags`` and ``--exclude-tags`` arguments to conn-check. 312 | 313 | Example YAML: 314 | 315 | .. code-block:: yaml 316 | 317 | - type: http 318 | url: http://google.com/ 319 | tags: 320 | - external 321 | 322 | To run just "external" checks:: 323 | 324 | conn-check --include-tags=external ... 325 | 326 | To run all the checks *except* external:: 327 | 328 | conn-check --exclude-tags=external 329 | 330 | Buffered/Ordered output 331 | ----------------------- 332 | 333 | conn-check normally executes with output to ``STDOUT`` buffered so that the output can be ordered, 334 | with failed checks printed first, grouped by destination, etc. 335 | 336 | If you'd rather see results as they are available you can use the ``-U``/``--unbuffered-output`` option 337 | to disable buffering. 338 | 339 | Generating firewall rules 340 | ------------------------- 341 | 342 | conn-check includes the ``conn-check-export-fw`` utility, which takes the same arguments as 343 | ``conn-check`` but runs in ``--dry-run`` mode and outputs a set of `egress` firewall 344 | rules in an easy-to-parse YAML format, for example: 345 | 346 | .. 
code-block:: yaml 347 | 348 | # Generated from the conn-check demo.yaml file 349 | egress: 350 | - from_host: mydevmachine 351 | ports: [8080] 352 | protocol: udp 353 | to_host: localhost 354 | - from_host: mydevmachine 355 | ports: [80, 443] 356 | protocol: tcp 357 | to_host: login.ubuntu.com 358 | - from_host: mydevmachine 359 | ports: [6379, 11211] 360 | protocol: tcp 361 | to_host: 127.0.0.1 362 | 363 | You can then use this output to generate your environment's firewall rules (e.g. with 364 | `EC2 security groups`, `OpenStack Neutron`, `iptables`, etc.). 365 | 366 | ``conn-check-convert-fw`` is a utility that does just this: it accepts multiple firewall 367 | rule YAML files, merges/de-dupes them, and outputs commands for AWS, OpenStack Neutron, 368 | OpenStack Nova (client), iptables, and ufw (mostly for testing purposes). 369 | 370 | It is designed for this workflow: 371 | 372 | * On each host you run conn-check from, you run ``conn-check-export-fw`` to generate 373 | a YAML file containing egress firewall rules. 374 | * Each of these files is transferred to a host with the correct DNS entries for the 375 | egress hosts. 376 | * On this host ``conn-check-convert-fw`` is run to generate a set of commands 377 | for your firewall. 378 | * These commands are audited by a human (and possibly merged with other rules, such as 379 | ingress rules), and then run to update your environment's firewall. 
380 | 381 | Building wheels 382 | --------------- 383 | 384 | To allow for easier/more portable distribution of this tool, you can build 385 | conn-check and all its dependencies as `Python wheels `_:: 386 | 387 | make clean-wheels 388 | make build-wheels 389 | make build-wheels-extra EXTRA=amqp 390 | make build-wheels-extra EXTRA=redis 391 | 392 | The ``build-wheels`` make target will build conn-check and its base 393 | dependencies, but to include the optional extra dependencies for other 394 | checks such as amqp, redis or postgres, you need to use the 395 | ``build-wheels-extra`` target with the ``EXTRA`` env value. 396 | 397 | By default all the wheels will be placed in ``./wheels``. 398 | 399 | 400 | Automatically generating conn-check YAML configurations 401 | ------------------------------------------------------- 402 | 403 | The `conn-check-configs `_ package contains utilities/libraries 404 | for generating checks from existing application configurations and environments, e.g. from Django settings modules 405 | and Juju environments. 406 | -------------------------------------------------------------------------------- /conn_check/check_impl.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import time 3 | 4 | from twisted.internet import reactor 5 | from twisted.internet.defer import ( 6 | returnValue, 7 | inlineCallbacks, 8 | maybeDeferred as _maybeDeferred, 9 | DeferredList, 10 | Deferred) 11 | from twisted.python.failure import Failure 12 | 13 | 14 | def maybeDeferred(f, *args, **kwargs): 15 | """Wrapper around Twisted's maybeDeferred. 16 | 17 | Adds a reference to the wrapped deferred instance inside the original 18 | function dict so we can track if something is a deferred. 
19 | """ 20 | deferred = _maybeDeferred(f, *args, **kwargs) 21 | f.func_dict['deferred'] = deferred 22 | return deferred 23 | 24 | 25 | def maybeDeferToThread(f, *args, **kwargs): 26 | """Call the function C{f} using a thread from the given threadpool. 27 | 28 | Returns the result as a Deferred. 29 | 30 | @param f: The function to call. May return a deferred. 31 | @param *args: positional arguments to pass to f. 32 | @param **kwargs: keyword arguments to pass to f. 33 | 34 | @return: A Deferred which fires a callback with the result of f, or an 35 | errback with a L{twisted.python.failure.Failure} if f throws an 36 | exception. 37 | """ 38 | threadpool = reactor.getThreadPool() 39 | 40 | d = Deferred() 41 | 42 | def realOnResult(result): 43 | if not isinstance(result, Failure): 44 | reactor.callFromThread(d.callback, result) 45 | else: 46 | reactor.callFromThread(d.errback, result) 47 | 48 | def onResult(success, result): 49 | assert success 50 | assert isinstance(result, Deferred) 51 | result.addBoth(realOnResult) 52 | 53 | threadpool.callInThreadWithCallback(onResult, maybeDeferred, 54 | f, *args, **kwargs) 55 | 56 | return d 57 | 58 | 59 | class Check(object): 60 | """Abstract base class for objects embodying connectivity checks.""" 61 | 62 | def check(self, pattern, results): 63 | """Run this check, if it matches the pattern. 64 | 65 | If the pattern matches, and this is a leaf node in the check tree, 66 | implementations of Check.check should call 67 | results.notify_start, then either results.notify_success or 68 | results.notify_failure. 69 | """ 70 | raise NotImplementedError("{}.check not implemented".format( 71 | type(self))) 72 | 73 | def skip(self, pattern, results): 74 | """Indicate that this check has been skipped. 75 | 76 | If the pattern matches and this is a leaf node in the check tree, 77 | implementations of Check.skip should call results.notify_skip. 
78 | """ 79 | raise NotImplementedError("{}.skip not implemented".format( 80 | type(self))) 81 | 82 | 83 | class ConditionalCheck(Check): 84 | """A Check that skips unless the given predicate is true at check time.""" 85 | 86 | def __init__(self, wrapped, predicate): 87 | """Initialize an instance.""" 88 | super(ConditionalCheck, self).__init__() 89 | self.wrapped = wrapped 90 | self.predicate = predicate 91 | 92 | def check(self, pattern, result): 93 | """Run the wrapped check if the predicate is true, else skip it.""" 94 | if self.predicate(): 95 | return self.wrapped.check(pattern, result) 96 | else: 97 | self.skip(pattern, result) 98 | 99 | def skip(self, pattern, result): 100 | """Skip the check.""" 101 | self.wrapped.skip(pattern, result) 102 | 103 | 104 | class ResultTracker(object): 105 | """Base class for objects which report or record check results.""" 106 | 107 | def notify_start(self, name, info): 108 | """Register the start of a check.""" 109 | 110 | def notify_skip(self, name): 111 | """Register a check being skipped.""" 112 | 113 | def notify_success(self, name, duration): 114 | """Register a successful check.""" 115 | 116 | def notify_failure(self, name, info, exc_info, duration): 117 | """Register the failure of a check.""" 118 | 119 | 120 | class PrefixResultWrapper(ResultTracker): 121 | """ResultWrapper wrapper which adds a prefix to recorded results.""" 122 | 123 | def __init__(self, wrapped, prefix): 124 | """Initialize an instance.""" 125 | super(PrefixResultWrapper, self).__init__() 126 | self.wrapped = wrapped 127 | self.prefix = prefix 128 | 129 | def make_name(self, name): 130 | """Make a name by prepending the prefix.""" 131 | return "{}{}".format(self.prefix, name) 132 | 133 | def notify_skip(self, name): 134 | """Register a check being skipped.""" 135 | self.wrapped.notify_skip(self.make_name(name)) 136 | 137 | def notify_start(self, name, info): 138 | """Register the start of a check.""" 139 | self.wrapped.notify_start(self.make_name(name), info) 140 | 141 | def 
notify_success(self, name, duration): 142 | """Register success.""" 143 | self.wrapped.notify_success(self.make_name(name), duration) 144 | 145 | def notify_failure(self, name, info, exc_info, duration): 146 | """Register failure.""" 147 | self.wrapped.notify_failure(self.make_name(name), 148 | info, exc_info, duration) 149 | 150 | 151 | class FailureCountingResultWrapper(ResultTracker): 152 | """ResultTracker wrapper which counts failures.""" 153 | 154 | def __init__(self, wrapped): 155 | """Initialize an instance.""" 156 | super(FailureCountingResultWrapper, self).__init__() 157 | self.wrapped = wrapped 158 | self.failure_count = 0 159 | 160 | def notify_skip(self, name): 161 | """Register a check being skipped.""" 162 | self.wrapped.notify_skip(name) 163 | 164 | def notify_start(self, name, info): 165 | """Register the start of a check.""" 166 | self.failure_count += 1 167 | self.wrapped.notify_start(name, info) 168 | 169 | def notify_success(self, name, duration): 170 | """Register success.""" 171 | self.failure_count -= 1 172 | self.wrapped.notify_success(name, duration) 173 | 174 | def notify_failure(self, name, info, exc_info, duration): 175 | """Register failure.""" 176 | self.wrapped.notify_failure(name, info, exc_info, duration) 177 | 178 | def any_failed(self): 179 | """Return True if any checks using this wrapper failed so far.""" 180 | return self.failure_count > 0 181 | 182 | 183 | class FunctionCheck(Check): 184 | """A Check which takes a check function.""" 185 | 186 | def __init__(self, name, check, info=None, blocking=False): 187 | """Initialize an instance.""" 188 | super(FunctionCheck, self).__init__() 189 | self.name = name 190 | self.info = info 191 | self.check_fn = check 192 | self.blocking = blocking 193 | 194 | @inlineCallbacks 195 | def check(self, pattern, results): 196 | """Call the check function.""" 197 | if not pattern.matches(self.name): 198 | returnValue(None) 199 | results.notify_start(self.name, self.info) 200 | start = 
time.time() 201 | try: 202 | if self.blocking: 203 | result = yield maybeDeferToThread(self.check_fn) 204 | else: 205 | result = yield maybeDeferred(self.check_fn) 206 | results.notify_success(self.name, time.time() - start) 207 | returnValue(result) 208 | except Exception: 209 | results.notify_failure(self.name, self.info, 210 | sys.exc_info(), time.time() - start) 211 | 212 | def skip(self, pattern, results): 213 | """Record the skip.""" 214 | if not pattern.matches(self.name): 215 | return 216 | results.notify_skip(self.name) 217 | 218 | 219 | class MultiCheck(Check): 220 | """A composite check comprised of multiple subchecks.""" 221 | 222 | def __init__(self, subchecks, strategy): 223 | """Initialize an instance.""" 224 | super(MultiCheck, self).__init__() 225 | self.subchecks = list(subchecks) 226 | self.strategy = strategy 227 | 228 | def check(self, pattern, results): 229 | """Run subchecks using the strategy supplied at creation time.""" 230 | return self.strategy(self.subchecks, pattern, results) 231 | 232 | def skip(self, pattern, results): 233 | """Skip subchecks.""" 234 | for subcheck in self.subchecks: 235 | subcheck.skip(pattern, results) 236 | 237 | 238 | class PrefixCheckWrapper(Check): 239 | """Runs a given check, adding a prefix to its name. 240 | 241 | This works by wrapping the pattern and result tracker objects 242 | passed to .check and .skip. 
243 | """ 244 | 245 | def __init__(self, wrapped, prefix): 246 | """Initialize an instance.""" 247 | super(PrefixCheckWrapper, self).__init__() 248 | self.wrapped = wrapped 249 | self.prefix = prefix 250 | 251 | def do_subcheck(self, subcheck, pattern, results): 252 | """Do a subcheck if the pattern could still match.""" 253 | pattern = pattern.assume_prefix(self.prefix) 254 | if not pattern.failed(): 255 | results = PrefixResultWrapper(wrapped=results, 256 | prefix=self.prefix) 257 | return subcheck(pattern, results) 258 | 259 | def check(self, pattern, results): 260 | """Run the check, prefixing results.""" 261 | return self.do_subcheck(self.wrapped.check, pattern, results) 262 | 263 | def skip(self, pattern, results): 264 | """Skip checks, prefixing results.""" 265 | self.do_subcheck(self.wrapped.skip, pattern, results) 266 | 267 | 268 | @inlineCallbacks 269 | def skipping_strategy(subchecks, pattern, results): 270 | """Strategy used to print checks out by just skipping them all.""" 271 | # We need at least one yield to make this a generator 272 | yield 273 | for subcheck in subchecks: 274 | subcheck.skip(pattern, results) 275 | 276 | 277 | @inlineCallbacks 278 | def sequential_strategy(subchecks, pattern, results): 279 | """Run subchecks sequentially, skipping checks after the first failure. 280 | 281 | This is most useful when the failure of one check in the sequence 282 | would imply the failure of later checks -- for example, it probably 283 | doesn't make sense to run an SSL check if the basic TCP check failed. 284 | 285 | Use sequential_check to create a meta-check using this strategy. 
286 | """ 287 | local_results = FailureCountingResultWrapper(wrapped=results) 288 | failed = False 289 | for subcheck in subchecks: 290 | if failed: 291 | subcheck.skip(pattern, local_results) 292 | else: 293 | yield maybeDeferred(subcheck.check, pattern, local_results) 294 | if local_results.any_failed(): 295 | failed = True 296 | 297 | 298 | def parallel_strategy(subchecks, pattern, results): 299 | """A strategy which runs the given subchecks in parallel. 300 | 301 | Most checks can potentially block for long periods, and shouldn't have 302 | interdependencies, so it makes sense to run them in parallel to 303 | shorten the overall run time. 304 | 305 | Use parallel_check to create a meta-check using this strategy. 306 | """ 307 | deferreds = [maybeDeferred(subcheck.check, pattern, results) 308 | for subcheck in subchecks] 309 | return DeferredList(deferreds) 310 | 311 | 312 | def skipping_check(subchecks): 313 | """Return a check that skips everything, used for printing checks.""" 314 | return MultiCheck(subchecks=subchecks, strategy=skipping_strategy) 315 | 316 | 317 | def parallel_check(subchecks): 318 | """Return a check that runs the given subchecks in parallel.""" 319 | return MultiCheck(subchecks=subchecks, strategy=parallel_strategy) 320 | 321 | 322 | def sequential_check(subchecks): 323 | """Return a check that runs the given subchecks in sequence.""" 324 | return MultiCheck(subchecks=subchecks, strategy=sequential_strategy) 325 | 326 | 327 | def add_check_prefix(*args): 328 | """Return an equivalent check with the given prefix prepended to its name. 329 | 330 | The final argument should be a check; the remaining arguments are treated 331 | as name components and joined with the check name using periods as 332 | separators. For example, if the name of a check is "baz", then: 333 | 334 | add_check_prefix("foo", "bar", check) 335 | 336 | ...will return a check with the effective name "foo.bar.baz". 
337 | """ 338 | args = list(args) 339 | check = args.pop(-1) 340 | path = ".".join(args) 341 | return PrefixCheckWrapper(wrapped=check, prefix="{}:".format(path)) 342 | 343 | 344 | def make_check(name, check, info=None, blocking=False): 345 | """Make a check object from a function.""" 346 | return FunctionCheck(name=name, check=check, info=info, blocking=blocking) 347 | 348 | 349 | def guard_check(check, predicate): 350 | """Wrap a check so that it is skipped unless the predicate is true.""" 351 | return ConditionalCheck(wrapped=check, predicate=predicate) 352 | -------------------------------------------------------------------------------- /conn_check/main.py: -------------------------------------------------------------------------------- 1 | from argparse import ArgumentParser 2 | from collections import defaultdict 3 | import sys 4 | from threading import Thread 5 | import time 6 | import traceback 7 | import yaml 8 | 9 | from twisted.internet import reactor 10 | from twisted.internet.defer import ( 11 | inlineCallbacks, 12 | ) 13 | from twisted.python.threadpool import ThreadPool 14 | 15 | from . 
import get_version_string 16 | from .check_impl import ( 17 | FailureCountingResultWrapper, 18 | parallel_check, 19 | skipping_check, 20 | ResultTracker, 21 | ) 22 | from .checks import CHECK_ALIASES, CHECKS, load_tls_certs 23 | from .patterns import ( 24 | SimplePattern, 25 | SumPattern, 26 | ) 27 | 28 | 29 | def check_from_description(check_description): 30 | _type = check_description['type'] 31 | 32 | if _type in CHECK_ALIASES: 33 | _type = CHECK_ALIASES[_type] 34 | 35 | check = CHECKS.get(_type, None) 36 | if check is None: 37 | raise AssertionError("Unknown check type: {}, available checks: {}" 38 | .format(_type, CHECKS.keys())) 39 | for arg in check['args']: 40 | if arg not in check_description: 41 | raise AssertionError('{} missing from check: {}'.format(arg, 42 | check_description)) 43 | 44 | if 'port' in check_description: 45 | check_description['port'] = int(check_description['port']) 46 | 47 | res = check['fn'](**check_description) 48 | return res 49 | 50 | 51 | def filter_tags(check, include, exclude): 52 | if not include and not exclude: 53 | return True 54 | 55 | check_tags = set(check.get('tags', [])) 56 | 57 | if include: 58 | result = bool(check_tags.intersection(include)) 59 | else: 60 | result = not bool(check_tags.intersection(exclude)) 61 | 62 | return result 63 | 64 | 65 | def build_checks(check_descriptions, connect_timeout, include_tags, 66 | exclude_tags, skip_checks=False): 67 | def set_timeout(desc): 68 | new_desc = dict(timeout=connect_timeout) 69 | new_desc.update(desc) 70 | return new_desc 71 | 72 | check_descriptions = filter( 73 | lambda c: filter_tags(c, include_tags, exclude_tags), 74 | check_descriptions) 75 | 76 | subchecks = map( 77 | lambda c: check_from_description(c), 78 | map(set_timeout, check_descriptions)) 79 | 80 | if skip_checks: 81 | strategy_wrapper = skipping_check 82 | else: 83 | strategy_wrapper = parallel_check 84 | return strategy_wrapper(subchecks) 85 | 86 | 87 | @inlineCallbacks 88 | def run_checks(checks, 
pattern, results): 89 | """Make and run all the pertinent checks.""" 90 | try: 91 | yield checks.check(pattern, results) 92 | finally: 93 | reactor.stop() 94 | 95 | 96 | class NagiosCompatibleArgsParser(ArgumentParser): 97 | 98 | def error(self, message): 99 | """A patched version of ArgumentParser.error. 100 | 101 | Does the same thing as ArgumentParser.error, i.e. prints an error 102 | message and exits, but does so with an exit code of 3 rather than 2, 103 | to maintain compatibility with Nagios checks. 104 | """ 105 | self.print_usage(sys.stderr) 106 | self.exit(3, '{}: error: {}\n'.format(self.prog, message)) 107 | 108 | 109 | class TimestampOutput(object): 110 | 111 | def __init__(self, output): 112 | self.start = time.time() 113 | self.output = output 114 | 115 | def write(self, data): 116 | self.output.write("{:.3f}: {}".format(time.time() - self.start, data)) 117 | 118 | 119 | class OrderedOutput(object): 120 | """Outputs check results ordered by FAILED, SUCCESSFUL, SKIPPED checks.""" 121 | 122 | def __init__(self, output): 123 | self.output = output 124 | 125 | self.failed = defaultdict(list) 126 | self.messages = defaultdict(list) 127 | self.skipped = [] 128 | 129 | def write(self, data): 130 | if data[:7] == 'SKIPPED': 131 | self.skipped.append(data) 132 | return 133 | 134 | name, message = data.split(' ', 1) 135 | 136 | # Standard check name format is {type}:{host}:{port} 137 | name_parts = name.split(':', 2) 138 | try: 139 | name_parts[2] = '' 140 | except IndexError: 141 | pass 142 | name = ':'.join(name_parts) 143 | 144 | if message[0:6] == 'FAILED': 145 | self.failed[name].append(data) 146 | else: 147 | self.messages[name].append(data) 148 | 149 | def flush(self): 150 | for _type in ('failed', 'messages'): 151 | for name, messages in sorted(getattr(self, _type).items()): 152 | messages.sort() 153 | map(self.output.write, messages) 154 | 155 | self.skipped.sort() 156 | map(self.output.write, self.skipped) 157 | 158 | 159 | class 
ConsoleOutput(ResultTracker): 160 | """Outputs check results to STDOUT.""" 161 | 162 | def __init__(self, output, verbose, show_tracebacks, show_duration): 163 | """Initialize an instance.""" 164 | super(ConsoleOutput, self).__init__() 165 | self.output = output 166 | self.verbose = verbose 167 | self.show_tracebacks = show_tracebacks 168 | self.show_duration = show_duration 169 | 170 | def format_duration(self, duration): 171 | if not self.show_duration: 172 | return "" 173 | return ": ({:.3f} s)".format(duration) 174 | 175 | def notify_start(self, name, info): 176 | """Register the start of a check.""" 177 | if self.verbose: 178 | if info: 179 | info = " ({})".format(info) 180 | else: 181 | info = '' 182 | self.output.write("Starting {}{}...\n".format(name, info)) 183 | 184 | def notify_skip(self, name): 185 | """Register a check being skipped.""" 186 | self.output.write("SKIPPED: {}\n".format(name)) 187 | 188 | def notify_success(self, name, duration): 189 | """Register a success.""" 190 | self.output.write("{} OK{}\n".format( 191 | name, self.format_duration(duration))) 192 | 193 | def notify_failure(self, name, info, exc_info, duration): 194 | """Register a failure.""" 195 | message = str(exc_info[1]).split("\n")[0] 196 | if info: 197 | message = "({}) {}".format(info, message) 198 | self.output.write("{} FAILED{} - {}\n".format( 199 | name, self.format_duration(duration), message)) 200 | 201 | if self.show_tracebacks: 202 | formatted = traceback.format_exception(exc_info[0], 203 | exc_info[1], 204 | exc_info[2], 205 | None) 206 | lines = "".join(formatted).split("\n") 207 | if len(lines) > 0 and len(lines[-1]) == 0: 208 | lines.pop() 209 | indented = "\n".join([" {}".format(line) for line in lines]) 210 | self.output.write("{}\n".format(indented)) 211 | 212 | 213 | class Command(object): 214 | """CLI command runner for the main conn-check endpoint.""" 215 | 216 | def __init__(self, args): 217 | self.make_arg_parser() 218 | self.parse_options(args) 219 | 
self.wrap_output(sys.stdout) 220 | self.load_descriptions() 221 | 222 | def make_arg_parser(self): 223 | """Set up an arg parser with our options.""" 224 | 225 | parser = NagiosCompatibleArgsParser() 226 | parser.add_argument("config_file", 227 | help="Config file specifying the checks to run.") 228 | parser.add_argument("patterns", nargs='*', 229 | help="Patterns to filter the checks.") 230 | parser.add_argument("-v", "--verbose", dest="verbose", 231 | action="store_true", default=False, 232 | help="Show additional status") 233 | parser.add_argument("-d", "--duration", dest="show_duration", 234 | action="store_true", default=False, 235 | help="Show duration") 236 | parser.add_argument("-t", "--tracebacks", dest="show_tracebacks", 237 | action="store_true", default=False, 238 | help="Show tracebacks on failure") 239 | parser.add_argument("--validate", dest="validate", 240 | action="store_true", default=False, 241 | help="Only validate the config file," 242 | " don't run checks.") 243 | parser.add_argument("--version", dest="print_version", 244 | action="store_true", default=False, 245 | help="Print the currently installed version.") 246 | parser.add_argument("--tls-certs-path", dest="cacerts_path", 247 | action="store", default="/etc/ssl/certs/", 248 | help="Path to TLS CA certificates.") 249 | parser.add_argument("--max-timeout", dest="max_timeout", type=float, 250 | action="store", help="Maximum execution time.") 251 | parser.add_argument("--connect-timeout", dest="connect_timeout", 252 | action="store", default=10, type=float, 253 | help="Network connection timeout.") 254 | parser.add_argument("-U", "--unbuffered-output", dest="buffer_output", 255 | action="store_false", default=True, 256 | help="Don't buffer output, write to STDOUT right " 257 | "away.") 258 | parser.add_argument("--dry-run", 259 | dest="dry_run", action="store_true", 260 | default=False, 261 | help="Skip all checks, just print out" 262 | " what would be run.") 263 | group = 
parser.add_mutually_exclusive_group() 264 | group.add_argument("--include-tags", dest="include_tags", 265 | action="store", default="", 266 | help="Comma separated list of tags to include.") 267 | group.add_argument("--exclude-tags", dest="exclude_tags", 268 | action="store", default="", 269 | help="Comma separated list of tags to exclude.") 270 | self.parser = parser 271 | 272 | def setup_reactor(self): 273 | """Setup the Twisted reactor with required customisations.""" 274 | 275 | def make_daemon_thread(*args, **kw): 276 | """Create a daemon thread.""" 277 | thread = Thread(*args, **kw) 278 | thread.daemon = True 279 | return thread 280 | 281 | threadpool = ThreadPool(minthreads=1) 282 | threadpool.threadFactory = make_daemon_thread 283 | reactor.threadpool = threadpool 284 | reactor.callWhenRunning(threadpool.start) 285 | 286 | if self.options.max_timeout is not None: 287 | def terminator(): 288 | # Hasta la vista, twisted 289 | reactor.stop() 290 | print('Maximum timeout reached: {}s'.format( 291 | self.options.max_timeout)) 292 | 293 | reactor.callLater(self.options.max_timeout, terminator) 294 | 295 | def parse_options(self, args): 296 | """Parse args (e.g. sys.argv) into options and set some config.""" 297 | 298 | options = self.parser.parse_args(list(args)) 299 | 300 | include_tags = [] 301 | if options.include_tags: 302 | include_tags = options.include_tags.split(',') 303 | include_tags = [tag.strip() for tag in include_tags] 304 | options.include_tags = include_tags 305 | 306 | exclude_tags = [] 307 | if options.exclude_tags: 308 | exclude_tags = options.exclude_tags.split(',') 309 | exclude_tags = [tag.strip() for tag in exclude_tags] 310 | options.exclude_tags = exclude_tags 311 | 312 | if options.patterns: 313 | self.patterns = SumPattern(map(SimplePattern, options.patterns)) 314 | else: 315 | self.patterns = SimplePattern("*") 316 | self.options = options 317 | 318 | def wrap_output(self, output): 319 | """Wraps an output stream (e.g. 
sys.stdout) from options.""" 320 | 321 | if self.options.show_duration: 322 | output = TimestampOutput(output) 323 | if self.options.buffer_output: 324 | # We buffer output so we can order it for human readable output 325 | output = OrderedOutput(output) 326 | 327 | results = ConsoleOutput(output=output, 328 | show_tracebacks=self.options.show_tracebacks, 329 | show_duration=self.options.show_duration, 330 | verbose=self.options.verbose) 331 | if not self.options.dry_run: 332 | results = FailureCountingResultWrapper(results) 333 | 334 | self.output = output 335 | self.results = results 336 | 337 | def load_descriptions(self): 338 | """Pre-load YAML checks file into a descriptions property.""" 339 | 340 | with open(self.options.config_file) as f: 341 | self.descriptions = yaml.load(f) 342 | 343 | def run(self): 344 | """Run/validate/dry-run the given command with options.""" 345 | 346 | checks = build_checks(self.descriptions, 347 | self.options.connect_timeout, 348 | self.options.include_tags, 349 | self.options.exclude_tags, 350 | self.options.dry_run) 351 | 352 | if not self.options.validate: 353 | if not self.options.dry_run: 354 | load_tls_certs(self.options.cacerts_path) 355 | 356 | self.setup_reactor() 357 | reactor.callWhenRunning(run_checks, checks, self.patterns, 358 | self.results) 359 | reactor.run() 360 | 361 | # Flush output, this really only has an effect when running 362 | # buffered output 363 | self.output.flush() 364 | 365 | if not self.options.dry_run and self.results.any_failed(): 366 | return 2 367 | 368 | return 0 369 | 370 | 371 | def parse_version_arg(): 372 | """Manually check for --version in args and output version info. 373 | 374 | We need to do this early because ArgumentParser won't let us mix 375 | and match non-default positional argument with a flag argument. 
376 | """ 377 | if '--version' in sys.argv: 378 | sys.stdout.write('conn-check {}\n'.format(get_version_string())) 379 | return True 380 | 381 | 382 | def run(*args): 383 | if parse_version_arg(): 384 | return 0 385 | 386 | cmd = Command(args) 387 | return cmd.run() 388 | 389 | 390 | def main(): 391 | sys.exit(run(*sys.argv[1:])) 392 | 393 | 394 | if __name__ == '__main__': 395 | main() 396 | -------------------------------------------------------------------------------- /debian/changelog: -------------------------------------------------------------------------------- 1 | conn-check (1.3.1-1) trusty; urgency=low 2 | 3 | * Added guards for port numbers and the HTTP checks expected_code to cast 4 | any given value to an int. 5 | 6 | -- Wes Mason (1stvamp) Tue, 11 Aug 2015 17:10:11 +0000 7 | 8 | conn-check (1.3.0-1) trusty; urgency=low 9 | 10 | * conn-check-convert-fw utility added, which generates firewall rule 11 | commands from conn-check FW egress YAML. 12 | 13 | -- Wes Mason (1stvamp) Wed, 15 Jul 2015 15:05:32 +0000 14 | 15 | conn-check (1.2.0-1) trusty; urgency=low 16 | 17 | * New smtp check type added. 18 | 19 | -- Wes Mason (1stvamp) Fri, 19 Jun 2015 15:22:41 +0000 20 | 21 | conn-check (1.1.0-9) trusty; urgency=low 22 | 23 | * Remove python-idna, because I suck and got the package wrong in trusty. 24 | 25 | -- Wes Mason (1stvamp) Mon, 08 Jun 2015 12:51:59 +0000 26 | 27 | conn-check (1.1.0-8) trusty; urgency=low 28 | 29 | * Add python-idna to deps. 30 | 31 | -- Wes Mason (1stvamp) Mon, 08 Jun 2015 12:44:31 +0000 32 | 33 | conn-check (1.1.0-7) trusty; urgency=low 34 | 35 | * Ensure ipaddress is installed first, due to cryptography package ordering. 36 | 37 | -- Wes Mason (1stvamp) Mon, 08 Jun 2015 12:20:09 +0000 38 | 39 | conn-check (1.1.0-6) trusty; urgency=low 40 | 41 | * Building 1.1.0 series with trusty. 
42 | 43 | -- Wes Mason (1stvamp) Mon, 08 Jun 2015 09:26:31 +0000 44 | 45 | conn-check (1.1.0-5) vivid; urgency=low 46 | 47 | * Fix for pip with latest dh-virtualenv on vivid builds. 48 | 49 | -- Wes Mason (1stvamp) Fri, 05 Jun 2015 22:05:10 +0000 50 | 51 | conn-check (1.1.0-4) vivid; urgency=low 52 | 53 | * More vivid build fixes, mostly python-pymongo being a bad python citizen. 54 | 55 | -- Wes Mason (1stvamp) Fri, 05 Jun 2015 16:07:14 +0000 56 | 57 | conn-check (1.1.0-3) vivid; urgency=low 58 | 59 | * Patch requirements for pip2pi change. 60 | 61 | -- Wes Mason (1stvamp) Fri, 05 Jun 2015 15:30:02 +0000 62 | 63 | conn-check (1.1.0-2) vivid; urgency=low 64 | 65 | * Missing python-setuptools for vivid builds. 66 | 67 | -- Wes Mason (1stvamp) Fri, 05 Jun 2015 15:06:58 +0000 68 | 69 | conn-check (1.1.0-1) vivid; urgency=low 70 | 71 | * Added new conn-check-export-fw tool to export firewall egress rules in a YAML 72 | format. 73 | * Refactored CLI command handling code to make it easier to extend/override. 74 | 75 | -- Wes Mason (1stvamp) Fri, 05 Jun 2015 12:32:31 +0000 76 | 77 | conn-check (1.0.18-2) trusty; urgency=low 78 | 79 | * Rename ndg-httpsclient requirement as ndg_httpsclient so it can be found 80 | correctly during build. 81 | 82 | -- Wes Mason (1stvamp) Mon, 13 Apr 2015 12:35:01 +0000 83 | 84 | conn-check (1.0.18-1) trusty; urgency=low 85 | 86 | * Ensure pyOpenSSL is always used instead of the ssl modules, 87 | see https://urllib3.readthedocs.org/en/latest/security.html#pyopenssl. 88 | 89 | -- Wes Mason (1stvamp) Mon, 13 Apr 2015 10:21:52 +0000 90 | 91 | conn-check (1.0.17-4) trusty; urgency=low 92 | 93 | * Ensure all non-deb'd deps are definitely vendored at build time. 94 | 95 | -- Wes Mason (1stvamp) Thu, 9 Apr 2015 21:23:49 +0000 96 | 97 | conn-check (1.0.17-3) trusty; urgency=low 98 | 99 | * Unpin six in vendored cryptography egg. 
100 | 101 | -- Wes Mason (1stvamp) Thu, 9 Apr 2015 16:06:01 +0000 102 | 103 | conn-check (1.0.17-1) trusty; urgency=low 104 | 105 | * Remove python-requests pin to make precise backport easier. 106 | 107 | -- Wes Mason (1stvamp) Wed, 8 Apr 2015 15:47:12 +0000 108 | 109 | conn-check (1.0.16-3) trusty; urgency=low 110 | 111 | * Add python-enum34 to deps. 112 | 113 | -- Wes Mason (1stvamp) Tue, 17 Mar 2015 21:29:10 +0000 114 | 115 | conn-check (1.0.16-2) trusty; urgency=low 116 | 117 | * Fix issue with enum34 not being vendored in build files. 118 | 119 | -- Wes Mason (1stvamp) Sat, 07 Mar 2015 02:19:13 +0000 120 | 121 | conn-check (1.0.16-1) trusty; urgency=low 122 | 123 | * Add --include-tags and --exclude-tags args with support for the `tags` YAML check field. 124 | 125 | -- Wes Mason (1stvamp) Fri, 06 Mar 2015 16:53:01 +0000 126 | 127 | conn-check (1.0.15-1) trusty; urgency=low 128 | 129 | * Removed need for initial patch by including missing files in package 130 | manifest. 131 | 132 | -- Wes Mason (1stvamp) Mon, 15 Dec 2014 12:13:40 +0000 133 | 134 | conn-check (1.0.14-4) trusty; urgency=low 135 | 136 | * Re-add python-dev and libssl-dev for build time deps. 137 | 138 | -- Wes Mason (1stvamp) Fri, 12 Dec 2014 22:02:11 +0000 139 | 140 | conn-check (1.0.14-3) trusty; urgency=low 141 | 142 | * Bump for lp rebuild. 143 | 144 | -- Wes Mason (1stvamp) Fri, 12 Dec 2014 21:53:59 +0000 145 | 146 | conn-check (1.0.14-2) trusty; urgency=low 147 | 148 | * Switch to using latest dh-virtualenv with global site-packages support. 149 | * Remove extra conn-check wheel from pythoncache and ensure all requirements 150 | files are vendored instead. 151 | * Remove all upstream deps from pythoncache. 152 | 153 | -- Wes Mason (1stvamp) Wed, 10 Dec 2014 21:22:00 +0000 154 | 155 | conn-check (1.0.14-1) trusty; urgency=low 156 | 157 | * Added better build Makefile. 158 | * Manifest changes released (1.0.14 bump). 
159 | 160 | -- Wes Mason (1stvamp) Tue, 09 Dec 2014 20:56:01 +0000 161 | 162 | conn-check (1.0.13-46) trusty; urgency=low 163 | 164 | * Switch secondary deps to Suggests instead of Recommends. 165 | 166 | -- Wes Mason (1stvamp) Thu, 04 Dec 2014 23:32:14 +0000 167 | 168 | conn-check (1.0.13-45) trusty; urgency=low 169 | 170 | * Ensure dh-virtualenv preinstalls pycparser so cffi works at build time. 171 | 172 | -- Wes Mason (1stvamp) Thu, 04 Dec 2014 23:18:01 +0000 173 | 174 | conn-check (1.0.13-44) trusty; urgency=low 175 | 176 | * Ensure we have libpq-dev for psycopg2 build. 177 | 178 | -- Wes Mason (1stvamp) Thu, 04 Dec 2014 23:05:43 +0000 179 | 180 | conn-check (1.0.13-43) trusty; urgency=low 181 | 182 | * Add amqp, postgres, redis and mongo virtualenv dependencies. 183 | * Add python-* dependencies for above to Recommends. 184 | 185 | -- Wes Mason (1stvamp) Thu, 04 Dec 2014 23:05:43 +0000 186 | 187 | conn-check (1.0.13-42) trusty; urgency=low 188 | 189 | * Remove pkg_resources requires for txrequests too. 190 | 191 | -- Wes Mason (1stvamp) Thu, 04 Dec 2014 12:05:01 +0000 192 | 193 | conn-check (1.0.13-41) trusty; urgency=low 194 | 195 | * Remove python-* packages from conn-check package requires.txt so 196 | installed entrypoint doesn't try loading pkg_resources for them (we don't 197 | actually need any). 198 | 199 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 15:42:32 +0000 200 | 201 | conn-check (1.0.13-40) trusty; urgency=low 202 | 203 | * Don't block the build if .so files aren't there to clean up. 204 | 205 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 15:42:32 +0000 206 | 207 | conn-check (1.0.13-39) trusty; urgency=low 208 | 209 | * Manually clean up build packages as pip has install-time python path. 210 | 211 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 15:08:01 +0000 212 | 213 | conn-check (1.0.13-38) trusty; urgency=low 214 | 215 | * Don't ignore errors in post-build venv cleanup. 216 | * Reference venv path directly with $(CURDIR). 
217 | 218 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 14:03:06 +0000 219 | 220 | conn-check (1.0.13-37) trusty; urgency=low 221 | 222 | * Reverted patched requirements that remove python-* deps, for the build 223 | phase, as we remove these manually from venv post-build anyway, and we 224 | need them in requirements.txt to generate a proper pythoncache. 225 | * Remove pycparser post-build as this is just a cffi dep. 226 | 227 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 13:12:45 +0000 228 | 229 | conn-check (1.0.13-36) trusty; urgency=low 230 | 231 | * Include pyopenssl/cffi et al in pythoncache for virtualenv build (they get 232 | removed from the virtualenv post-build). 233 | 234 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 13:12:45 +0000 235 | 236 | conn-check (1.0.13-35) trusty; urgency=low 237 | 238 | * Ensure requirements.txt is patched to remove python-* in builder. 239 | 240 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 13:04:01 +0000 241 | 242 | conn-check (1.0.13-34) trusty; urgency=low 243 | 244 | * Clean up build deps from virtualenv. 245 | * Ensure we have cffi preinstalled for cryptography build. 246 | * Ensure we have python-pyasn1 and python-zope.interface at install time. 247 | 248 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 12:04:54 +0000 249 | 250 | conn-check (1.0.13-34) trusty; urgency=low 251 | 252 | * Ensure requirements.txt is the same as debian-requirements.txt in orig 253 | payload. 254 | 255 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 11:48:00 +0000 256 | 257 | conn-check (1.0.13-33) trusty; urgency=low 258 | 259 | * Remove no-global-site-packages.txt as part of dh-virtualenv target to 260 | enable site-packages in virtualenv. 261 | 262 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 11:32:01 +0000 263 | 264 | conn-check (1.0.13-32) trusty; urgency=low 265 | 266 | * Use python-* packages where possible. 267 | 268 | -- Wes Mason (1stvamp) Wed, 03 Dec 2014 10:51:32 +0000 269 | 270 | conn-check (1.0.13-31) trusty; urgency=low 271 | 272 | * Bump. 
273 | 274 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 23:54:58 +0000 275 | 276 | conn-check (1.0.13-29) trusty; urgency=low 277 | 278 | * Use dh-virtualenv >=0.7. 279 | 280 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 22:58:01 +0000 281 | 282 | conn-check (1.0.13-28) trusty; urgency=low 283 | 284 | * Add libssl-dev to build deps. 285 | 286 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 22:58:01 +0000 287 | 288 | conn-check (1.0.13-27) trusty; urgency=low 289 | 290 | * Add libffi-dev to build deps. 291 | 292 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 21:58:56 +0000 293 | 294 | conn-check (1.0.13-26) trusty; urgency=low 295 | 296 | * Add python-dev to build deps. 297 | 298 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 17:41:06 +0000 299 | 300 | conn-check (1.0.13-25) trusty; urgency=low 301 | 302 | * Preinstall cffi during build 303 | 304 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 17:11:05 +0000 305 | 306 | conn-check (1.0.13-24) trusty; urgency=low 307 | 308 | * Preinstall six and setuptools during build 309 | 310 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 16:22:01 +0000 311 | 312 | conn-check (1.0.13-23) trusty; urgency=low 313 | 314 | * Use PyYAML instead of pyyaml for local deps 315 | 316 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 15:43:00 +0000 317 | 318 | conn-check (1.0.13-22) trusty; urgency=low 319 | 320 | * Use "Twisted" instead of "twisted" to match case in file:// PyPI URI. 321 | 322 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 15:12:00 +0000 323 | 324 | conn-check (1.0.13-21) trusty; urgency=low 325 | 326 | * Include entire python cache in debian dir. 327 | 328 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 14:23:47 +0000 329 | 330 | conn-check (1.0.13-20) trusty; urgency=low 331 | 332 | * Use pythoncache from debian dir. 333 | 334 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 14:02:01 +0000 335 | 336 | conn-check (1.0.13-19) trusty; urgency=low 337 | 338 | * Extract pythoncache from orig.tar.gz. 
339 | 340 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 13:23:10 +0000 341 | 342 | conn-check (1.0.13-18) trusty; urgency=low 343 | 344 | * Build a full PyPI index and sync via pythoncache.tar. 345 | 346 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 12:15:30 +0000 347 | 348 | conn-check (1.0.13-17) trusty; urgency=low 349 | 350 | * Set --no-index and --find-links in pip during deb build. 351 | 352 | -- Wes Mason (1stvamp) Tue, 02 Dec 2014 00:42:00 +0000 353 | 354 | conn-check (1.0.13-16) trusty; urgency=low 355 | 356 | * Set pypi-url using $(CURDIR) 357 | 358 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 23:43:01 +0000 359 | 360 | conn-check (1.0.13-15) trusty; urgency=low 361 | 362 | * Include individual eggs/wheels in include-binaries listing. 363 | 364 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 23:43:01 +0000 365 | 366 | conn-check (1.0.13-14) trusty; urgency=low 367 | 368 | * Use better egg/wheel caching. 369 | 370 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 22:02:51 +0000 371 | 372 | conn-check (1.0.13-13) trusty; urgency=low 373 | 374 | * Use cached eggs for pip. 375 | 376 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 16:38:00 +0000 377 | 378 | conn-check (1.0.13-13) trusty; urgency=low 379 | 380 | * Remove --no-test as not supported in dh-virtualenv on lp. 381 | 382 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 11:43:05 +0000 383 | 384 | conn-check (1.0.13-12) trusty; urgency=low 385 | 386 | * Stub out dh-test. 387 | 388 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 11:38:01 +0000 389 | 390 | conn-check (1.0.13-11) trusty; urgency=low 391 | 392 | * Stub out auto-build. 393 | 394 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 11:23:01 +0000 395 | 396 | conn-check (1.0.13-10) trusty; urgency=low 397 | 398 | * Override dh-auto-build with a different make target. 
399 | 400 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 11:12:45 +0000 401 | 402 | conn-check (1.0.13-9) trusty; urgency=low 403 | 404 | * Disable post tests run in dh-make 405 | 406 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 10:59:07 +0000 407 | 408 | conn-check (1.0.13-8) trusty; urgency=low 409 | 410 | * Disable post tests run in dh-virtualenv. 411 | 412 | -- Wes Mason (1stvamp) Mon, 01 Dec 2014 10:40:32 +0000 413 | 414 | conn-check (1.0.13-7) trusty; urgency=low 415 | 416 | * Fix dh-virtualenv spec. 417 | 418 | -- Wes Mason (1stvamp) Fri, 28 Nov 2014 15:05:01 +0000 419 | 420 | conn-check (1.0.13-6) trusty; urgency=low 421 | 422 | * First attempt at using dh-virtualenv to manage dependencies. 423 | 424 | -- Wes Mason (1stvamp) Fri, 28 Nov 2014 14:52:21 +0000 425 | 426 | conn-check (1.0.13-5) trusty; urgency=low 427 | 428 | * Add copyright file in deb format. 429 | 430 | -- Wes Mason (1stvamp) Fri, 28 Nov 2014 12:41:00 +0000 431 | 432 | conn-check (1.0.13-4) trusty; urgency=low 433 | 434 | * Fix dh_auto_install hook and removed .install hook. 435 | 436 | -- Wes Mason (1stvamp) Fri, 28 Nov 2014 11:29:00 +0000 437 | 438 | conn-check (1.0.13-3) trusty; urgency=low 439 | 440 | * Changed package Section to admin, and added Homepage. 441 | * Cleaned up package description. 442 | * Add .install hook to copy binaries into place.
443 | 444 | -- Wes Mason (1stvamp) Fri, 28 Nov 2014 12:00:00 +0000 445 | 446 | conn-check (1.0.13-2) trusty; urgency=low 447 | 448 | * package renamed to conn-check from python-conn-check 449 | 450 | -- Wes Mason (1stvamp) Thu, 27 Nov 2014 14:06:03 +0000 451 | 452 | conn-check (1.0.13-1) trusty; urgency=low 453 | 454 | * source package automatically created by stdeb 0.8.2 455 | * dependencies modified to Debian/Ubuntu python-* packages 456 | 457 | -- Wes Mason (1stvamp) Thu, 27 Nov 2014 14:06:03 +0000 458 | -------------------------------------------------------------------------------- /tests.py: -------------------------------------------------------------------------------- 1 | import operator 2 | import random 3 | import testtools 4 | from StringIO import StringIO 5 | 6 | from testtools import matchers 7 | 8 | from conn_check.check_impl import ( 9 | FunctionCheck, 10 | MultiCheck, 11 | parallel_strategy, 12 | PrefixCheckWrapper, 13 | sequential_strategy, 14 | ) 15 | from conn_check.checks import ( 16 | CHECKS, 17 | extract_host_port, 18 | make_amqp_check, 19 | make_http_check, 20 | make_memcache_check, 21 | make_mongodb_check, 22 | make_postgres_check, 23 | make_redis_check, 24 | make_smtp_check, 25 | make_tls_check, 26 | make_tcp_check, 27 | make_udp_check, 28 | ) 29 | from conn_check.main import ( 30 | build_checks, 31 | check_from_description, 32 | OrderedOutput, 33 | ) 34 | 35 | 36 | class FunctionCheckMatcher(testtools.Matcher): 37 | 38 | def __init__(self, name, info, blocking=False): 39 | self.name = name 40 | self.info = info 41 | self.blocking = blocking 42 | 43 | def match(self, matchee): 44 | checks = [] 45 | checks.append(matchers.IsInstance(FunctionCheck)) 46 | checks.append(matchers.Annotate( 47 | "name doesn't match", 48 | matchers.AfterPreprocessing(operator.attrgetter('name'), 49 | matchers.Equals(self.name)))) 50 | checks.append(matchers.Annotate( 51 | "info doesn't match", 52 | matchers.AfterPreprocessing(operator.attrgetter('info'), 53 | 
matchers.Equals(self.info)))) 54 | checks.append(matchers.Annotate( 55 | "blocking doesn't match", 56 | matchers.AfterPreprocessing(operator.attrgetter('blocking'), 57 | matchers.Equals(self.blocking)))) 58 | return matchers.MatchesAll(*checks).match(matchee) 59 | 60 | def __str__(self): 61 | return ("Is a FunctionCheck with <name={} info={} " 62 | "blocking={}>".format(self.name, self.info, self.blocking)) 63 | 64 | 65 | class MultiCheckMatcher(testtools.Matcher): 66 | 67 | def __init__(self, strategy, subchecks): 68 | self.strategy = strategy 69 | self.subchecks = subchecks 70 | 71 | def match(self, matchee): 72 | checks = [] 73 | checks.append(matchers.IsInstance(MultiCheck)) 74 | checks.append(matchers.AfterPreprocessing(operator.attrgetter('strategy'), 75 | matchers.Is(self.strategy))) 76 | checks.append(matchers.AfterPreprocessing(operator.attrgetter('subchecks'), 77 | matchers.MatchesListwise(self.subchecks))) 78 | return matchers.MatchesAll(*checks).match(matchee) 79 | 80 | def __str__(self): 81 | return ("Is a MultiCheck with " 82 | "<strategy={} subchecks={}>".format(self.strategy, self.subchecks)) 83 | 84 | 85 | class ExtractHostPortTests(testtools.TestCase): 86 | 87 | def test_basic(self): 88 | self.assertEqual(extract_host_port('http://localhost:80/'), 89 | ('localhost', 80, 'http')) 90 | 91 | def test_no_scheme(self): 92 | self.assertEqual(extract_host_port('//localhost/'), 93 | ('localhost', 80, 'http')) 94 | 95 | def test_no_port_http(self): 96 | self.assertEqual(extract_host_port('http://localhost/'), 97 | ('localhost', 80, 'http')) 98 | 99 | def test_no_port_https(self): 100 | self.assertEqual(extract_host_port('https://localhost/'), 101 | ('localhost', 443, 'https')) 102 | 103 | 104 | class ConnCheckTest(testtools.TestCase): 105 | 106 | def test_make_tcp_check(self): 107 | result = make_tcp_check('localhost', 8080) 108 | self.assertThat(result, FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 109 | 110 | def test_make_tls_check(self): 111 | result = make_tls_check('localhost', 8080, verify=True)
112 | self.assertThat(result, FunctionCheckMatcher('tls:localhost:8080', 'localhost:8080')) 113 | 114 | def test_make_udp_check(self): 115 | result = make_udp_check('localhost', 8080, 'foo', 'bar') 116 | self.assertThat(result, FunctionCheckMatcher('udp:localhost:8080', 'localhost:8080')) 117 | 118 | def test_make_http_check(self): 119 | result = make_http_check('http://localhost/') 120 | self.assertIsInstance(result, PrefixCheckWrapper) 121 | self.assertEqual(result.prefix, 'http:http://localhost/:') 122 | wrapped = result.wrapped 123 | self.assertIsInstance(wrapped, MultiCheck) 124 | self.assertIs(wrapped.strategy, sequential_strategy) 125 | self.assertEqual(len(wrapped.subchecks), 2) 126 | self.assertThat(wrapped.subchecks[0], 127 | FunctionCheckMatcher('tcp:localhost:80', 'localhost:80')) 128 | self.assertThat(wrapped.subchecks[1], 129 | FunctionCheckMatcher('', 'GET http://localhost/')) 130 | 131 | def test_make_http_check_https(self): 132 | result = make_http_check('https://localhost/') 133 | self.assertIsInstance(result, PrefixCheckWrapper) 134 | self.assertEqual(result.prefix, 'http:https://localhost/:') 135 | wrapped = result.wrapped 136 | self.assertIsInstance(wrapped, MultiCheck) 137 | self.assertIs(wrapped.strategy, sequential_strategy) 138 | self.assertEqual(len(wrapped.subchecks), 2) 139 | self.assertThat(wrapped.subchecks[0], 140 | FunctionCheckMatcher('tcp:localhost:443', 'localhost:443')) 141 | self.assertThat(wrapped.subchecks[1], 142 | FunctionCheckMatcher('', 'GET https://localhost/')) 143 | 144 | def test_make_amqp_check(self): 145 | result = make_amqp_check('localhost', 8080, 'foo', 146 | 'bar', use_tls=True, vhost='/') 147 | self.assertIsInstance(result, MultiCheck) 148 | self.assertIs(result.strategy, sequential_strategy) 149 | self.assertEqual(len(result.subchecks), 3) 150 | self.assertThat(result.subchecks[0], 151 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 152 | self.assertThat(result.subchecks[1], 153 | 
FunctionCheckMatcher('tls:localhost:8080', 'localhost:8080')) 154 | self.assertThat(result.subchecks[2], 155 | FunctionCheckMatcher('amqp:localhost:8080', 'user foo')) 156 | 157 | def test_make_amqp_check_no_tls(self): 158 | result = make_amqp_check('localhost', 8080, 'foo', 159 | 'bar', use_tls=False, vhost='/') 160 | self.assertIsInstance(result, MultiCheck) 161 | self.assertIs(result.strategy, sequential_strategy) 162 | self.assertEqual(len(result.subchecks), 2) 163 | self.assertThat(result.subchecks[0], 164 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 165 | self.assertThat(result.subchecks[1], 166 | FunctionCheckMatcher('amqp:localhost:8080', 'user foo')) 167 | 168 | def test_make_postgres_check(self): 169 | result = make_postgres_check('localhost', 8080,'foo', 170 | 'bar', 'test') 171 | self.assertIsInstance(result, MultiCheck) 172 | self.assertIs(result.strategy, sequential_strategy) 173 | self.assertEqual(len(result.subchecks), 2) 174 | self.assertThat(result.subchecks[0], 175 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 176 | self.assertThat(result.subchecks[1], 177 | FunctionCheckMatcher('postgres:localhost:8080', 'user foo', blocking=True)) 178 | 179 | def test_make_postgres_check_local_socket(self): 180 | result = make_postgres_check('/local.sock', 8080,'foo', 181 | 'bar', 'test') 182 | self.assertIsInstance(result, MultiCheck) 183 | self.assertIs(result.strategy, sequential_strategy) 184 | self.assertEqual(len(result.subchecks), 1) 185 | self.assertThat(result.subchecks[0], 186 | FunctionCheckMatcher('postgres:/local.sock:8080', 'user foo', blocking=True)) 187 | 188 | def test_make_redis_check(self): 189 | result = make_redis_check('localhost', 8080) 190 | self.assertIsInstance(result, PrefixCheckWrapper) 191 | self.assertEqual(result.prefix, 'redis:localhost:8080:') 192 | wrapped = result.wrapped 193 | self.assertIsInstance(wrapped, MultiCheck) 194 | self.assertIs(wrapped.strategy, sequential_strategy) 195 | 
self.assertEqual(len(wrapped.subchecks), 2) 196 | self.assertThat(wrapped.subchecks[0], 197 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 198 | self.assertThat(wrapped.subchecks[1], FunctionCheckMatcher('connect', None)) 199 | 200 | def test_make_redis_check_with_password(self): 201 | result = make_redis_check('localhost', 8080, 'foobar') 202 | self.assertIsInstance(result, PrefixCheckWrapper) 203 | self.assertEqual(result.prefix, 'redis:localhost:8080:') 204 | wrapped = result.wrapped 205 | self.assertIsInstance(wrapped, MultiCheck) 206 | self.assertIs(wrapped.strategy, sequential_strategy) 207 | self.assertEqual(len(wrapped.subchecks), 2) 208 | self.assertThat(wrapped.subchecks[0], 209 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 210 | self.assertThat(wrapped.subchecks[1], 211 | FunctionCheckMatcher('connect with auth', None)) 212 | 213 | def test_make_memcache_check(self): 214 | result = make_memcache_check('localhost', 8080) 215 | self.assertIsInstance(result, PrefixCheckWrapper) 216 | self.assertEqual(result.prefix, 'memcache:localhost:8080:') 217 | wrapped = result.wrapped 218 | self.assertIsInstance(wrapped, MultiCheck) 219 | self.assertIs(wrapped.strategy, sequential_strategy) 220 | self.assertEqual(len(wrapped.subchecks), 2) 221 | self.assertThat(wrapped.subchecks[0], 222 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 223 | self.assertThat(wrapped.subchecks[1], FunctionCheckMatcher('connect', None)) 224 | 225 | def test_make_mongodb_check(self): 226 | result = make_mongodb_check('localhost', 8080) 227 | self.assertIsInstance(result, PrefixCheckWrapper) 228 | self.assertEqual(result.prefix, 'mongodb:localhost:8080:') 229 | wrapped = result.wrapped 230 | self.assertIsInstance(wrapped, MultiCheck) 231 | self.assertIs(wrapped.strategy, sequential_strategy) 232 | self.assertEqual(len(wrapped.subchecks), 2) 233 | self.assertThat(wrapped.subchecks[0], 234 | FunctionCheckMatcher('tcp:localhost:8080', 
'localhost:8080')) 235 | self.assertThat(wrapped.subchecks[1], FunctionCheckMatcher('connect', None)) 236 | 237 | def test_make_mongodb_check_with_username(self): 238 | result = make_mongodb_check('localhost', 8080, 'foo') 239 | self.assertIsInstance(result, PrefixCheckWrapper) 240 | self.assertEqual(result.prefix, 'mongodb:localhost:8080:') 241 | wrapped = result.wrapped 242 | self.assertIsInstance(wrapped, MultiCheck) 243 | self.assertIs(wrapped.strategy, sequential_strategy) 244 | self.assertEqual(len(wrapped.subchecks), 2) 245 | self.assertThat(wrapped.subchecks[0], 246 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 247 | self.assertThat(wrapped.subchecks[1], 248 | FunctionCheckMatcher('connect with auth', None)) 249 | 250 | def test_make_smtp_check(self): 251 | result = make_smtp_check('localhost', 8080, 'foo', 'bar', 252 | 'foo@example.com', 'bax@example.com', 253 | use_tls=True) 254 | self.assertIsInstance(result, MultiCheck) 255 | self.assertIs(result.strategy, sequential_strategy) 256 | self.assertEqual(len(result.subchecks), 3) 257 | self.assertThat(result.subchecks[0], 258 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 259 | self.assertThat(result.subchecks[1], 260 | FunctionCheckMatcher('tls:localhost:8080', 'localhost:8080')) 261 | self.assertThat(result.subchecks[2], 262 | FunctionCheckMatcher('smtp:localhost:8080', 'user foo')) 263 | 264 | def test_make_smtp_check_no_tls(self): 265 | result = make_smtp_check('localhost', 8080, 'foo', 'bar', 266 | 'foo@example.com', 'bax@example.com', 267 | use_tls=False) 268 | self.assertIsInstance(result, MultiCheck) 269 | self.assertIs(result.strategy, sequential_strategy) 270 | self.assertEqual(len(result.subchecks), 2) 271 | self.assertThat(result.subchecks[0], 272 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 273 | self.assertThat(result.subchecks[1], 274 | FunctionCheckMatcher('smtp:localhost:8080', 'user foo')) 275 | 276 | def 
test_check_from_description_unknown_type(self): 277 | e = self.assertRaises(AssertionError, 278 | check_from_description, {'type': 'foo'}) 279 | self.assertEqual( 280 | str(e), 281 | "Unknown check type: foo, available checks: {}".format(CHECKS.keys())) 282 | 283 | def test_check_from_description_missing_arg(self): 284 | description = {'type': 'tcp'} 285 | e = self.assertRaises(AssertionError, 286 | check_from_description, description) 287 | self.assertEqual( 288 | str(e), 289 | "host missing from check: {}".format(description)) 290 | 291 | def test_check_from_description_makes_check(self): 292 | description = {'type': 'tcp', 'host': 'localhost', 'port': '8080'} 293 | result = check_from_description(description) 294 | self.assertThat(result, 295 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')) 296 | 297 | def test_build_checks(self): 298 | description = [{'type': 'tcp', 'host': 'localhost', 'port': '8080'}] 299 | result = build_checks(description, 10, [], []) 300 | self.assertThat(result, 301 | MultiCheckMatcher(strategy=parallel_strategy, 302 | subchecks=[FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080')])) 303 | 304 | def test_build_checks_with_tags(self): 305 | descriptions = [ 306 | {'type': 'tcp', 'host': 'localhost', 'port': '8080'}, 307 | {'type': 'tcp', 'host': 'localhost2', 'port': '8080', 308 | 'tags': ['foo']}, 309 | {'type': 'tcp', 'host': 'localhost3', 'port': '8080', 310 | 'tags': ['foo', 'bar']}, 311 | {'type': 'tcp', 'host': 'localhost4', 'port': '8080', 312 | 'tags': ['baz']}, 313 | {'type': 'tcp', 'host': 'localhost5', 'port': '8080', 314 | 'tags': ['bar']}, 315 | ] 316 | result = build_checks(descriptions, 10, ['foo', 'bar'], []) 317 | expected_subchecks = [ 318 | FunctionCheckMatcher('tcp:localhost2:8080', 'localhost2:8080'), 319 | FunctionCheckMatcher('tcp:localhost3:8080', 'localhost3:8080'), 320 | FunctionCheckMatcher('tcp:localhost5:8080', 'localhost5:8080'), 321 | ] 322 | self.assertThat(result, 323 | 
MultiCheckMatcher(strategy=parallel_strategy, 324 | subchecks=expected_subchecks)) 325 | 326 | def test_build_checks_with_excluded_tags(self): 327 | descriptions = [ 328 | {'type': 'tcp', 'host': 'localhost', 'port': '8080'}, 329 | {'type': 'tcp', 'host': 'localhost2', 'port': '8080', 330 | 'tags': ['foo']}, 331 | {'type': 'tcp', 'host': 'localhost3', 'port': '8080', 332 | 'tags': ['foo', 'bar']}, 333 | {'type': 'tcp', 'host': 'localhost4', 'port': '8080', 334 | 'tags': ['baz']}, 335 | {'type': 'tcp', 'host': 'localhost5', 'port': '8080', 336 | 'tags': ['bar']}, 337 | ] 338 | result = build_checks(descriptions, 10, [], ['bar', 'baz']) 339 | expected_subchecks = [ 340 | FunctionCheckMatcher('tcp:localhost:8080', 'localhost:8080'), 341 | FunctionCheckMatcher('tcp:localhost2:8080', 'localhost2:8080'), 342 | ] 343 | self.assertThat(result, 344 | MultiCheckMatcher(strategy=parallel_strategy, 345 | subchecks=expected_subchecks)) 346 | 347 | def test_ordered_output(self): 348 | lines = [ 349 | 'SKIPPED: xyz3:localhost:666\n', 350 | 'bar2:localhost:8080 FAILED: error\n', 351 | 'SKIPPED: foo2:localhost:8080\n', 352 | 'baz2:localhost:42 OK\n', 353 | 'SKIPPED: bar2:localhost:8080\n', 354 | 'xyz2:localhost:666 FAILED: error\n', 355 | 'xyz1:localhost:666 OK\n', 356 | 'foo1:localhost:8080 FAILED: error\n', 357 | 'baz1:localhost:42 OK\n', 358 | ] 359 | expected = ( 360 | 'bar2:localhost:8080 FAILED: error\n' 361 | 'foo1:localhost:8080 FAILED: error\n' 362 | 'xyz2:localhost:666 FAILED: error\n' 363 | 'baz1:localhost:42 OK\n' 364 | 'baz2:localhost:42 OK\n' 365 | 'xyz1:localhost:666 OK\n' 366 | 'SKIPPED: bar2:localhost:8080\n' 367 | 'SKIPPED: foo2:localhost:8080\n' 368 | 'SKIPPED: xyz3:localhost:666\n' 369 | ) 370 | 371 | output = OrderedOutput(StringIO()) 372 | map(output.write, lines) 373 | output.flush() 374 | self.assertEqual(expected, output.output.getvalue()) 375 | 376 | output = OrderedOutput(StringIO()) 377 | random.shuffle(lines) 378 | map(output.write, lines) 379 | 
output.flush() 380 | self.assertEqual(expected, output.output.getvalue()) 381 | -------------------------------------------------------------------------------- /conn_check/checks.py: -------------------------------------------------------------------------------- 1 | from email.mime.text import MIMEText 2 | import glob 3 | import os 4 | from pkg_resources import resource_stream 5 | from StringIO import StringIO 6 | import urlparse 7 | 8 | from OpenSSL import SSL 9 | from OpenSSL.crypto import load_certificate, FILETYPE_PEM 10 | 11 | from twisted.internet import reactor, ssl 12 | from twisted.internet.error import DNSLookupError, TimeoutError 13 | from twisted.internet.abstract import isIPAddress 14 | from twisted.internet.defer import ( 15 | Deferred, 16 | inlineCallbacks, 17 | ) 18 | from twisted.internet.protocol import ( 19 | ClientCreator, 20 | DatagramProtocol, 21 | Protocol, 22 | ) 23 | from twisted.mail.smtp import ESMTPSenderFactory 24 | from twisted.protocols.memcache import MemCacheProtocol 25 | 26 | from txrequests import Session 27 | from requests.auth import HTTPDigestAuth 28 | 29 | try: 30 | from requests.packages.urllib3 import disable_warnings 31 | from requests.packages.urllib3.contrib.pyopenssl import inject_into_urllib3 32 | except ImportError: 33 | from urllib3 import disable_warnings 34 | from urllib3.contrib.pyopenssl import inject_into_urllib3 35 | 36 | from .check_impl import ( 37 | add_check_prefix, 38 | make_check, 39 | sequential_check, 40 | ) 41 | 42 | 43 | # Ensure we always use pyOpenSSL instead of the ssl builtin 44 | inject_into_urllib3() 45 | 46 | CA_CERTS = [] 47 | 48 | 49 | def load_tls_certs(path): 50 | cert_map = {} 51 | for filepath in glob.glob("{}/*.pem".format(os.path.abspath(path))): 52 | # There might be some dead symlinks in there, 53 | # so let's make sure it's real. 
54 | if os.path.isfile(filepath): 55 | data = open(filepath).read() 56 | x509 = load_certificate(FILETYPE_PEM, data) 57 | # Now, de-duplicate in case the same cert has multiple names. 58 | cert_map[x509.digest('sha1')] = x509 59 | 60 | CA_CERTS.extend(cert_map.values()) 61 | 62 | 63 | class TCPCheckProtocol(Protocol): 64 | 65 | def connectionMade(self): 66 | self.transport.loseConnection() 67 | 68 | 69 | class VerifyingContextFactory(ssl.CertificateOptions): 70 | 71 | def __init__(self, verify, caCerts, verifyCallback=None): 72 | ssl.CertificateOptions.__init__(self, verify=verify, 73 | caCerts=caCerts, 74 | method=SSL.SSLv23_METHOD) 75 | self.verifyCallback = verifyCallback 76 | 77 | def _makeContext(self): 78 | context = ssl.CertificateOptions._makeContext(self) 79 | if self.verifyCallback is not None: 80 | context.set_verify( 81 | SSL.VERIFY_PEER | SSL.VERIFY_FAIL_IF_NO_PEER_CERT, 82 | self.verifyCallback) 83 | return context 84 | 85 | 86 | @inlineCallbacks 87 | def do_tcp_check(host, port, tls=False, tls_verify=True, 88 | timeout=None): 89 | """Generic connection check function.""" 90 | if not isIPAddress(host): 91 | try: 92 | ip = yield reactor.resolve(host, timeout=(1, timeout)) 93 | except DNSLookupError: 94 | raise ValueError("dns resolution failed") 95 | else: 96 | ip = host 97 | creator = ClientCreator(reactor, TCPCheckProtocol) 98 | try: 99 | if tls: 100 | context = VerifyingContextFactory(tls_verify, CA_CERTS) 101 | yield creator.connectSSL(ip, port, context, 102 | timeout=timeout) 103 | else: 104 | yield creator.connectTCP(ip, port, timeout=timeout) 105 | except TimeoutError: 106 | if ip == host: 107 | raise ValueError("timed out") 108 | else: 109 | raise ValueError("timed out connecting to {}".format(ip)) 110 | 111 | 112 | def make_tcp_check(host, port, timeout=None, **kwargs): 113 | """Return a check for TCP connectivity.""" 114 | return make_check("tcp:{}:{}".format(host, port), 115 | lambda: do_tcp_check(host, port, timeout=timeout), 116 | 
info="{}:{}".format(host, port)) 117 | 118 | 119 | def make_tls_check(host, port, disable_tls_verification=False, timeout=None, 120 | **kwargs): 121 | """Return a check for TLS setup.""" 122 | 123 | verify = not disable_tls_verification 124 | check = make_check("tls:{}:{}".format(host, port), 125 | lambda: do_tcp_check(host, port, tls=True, 126 | tls_verify=verify, 127 | timeout=timeout), 128 | info="{}:{}".format(host, port)) 129 | 130 | return check 131 | 132 | 133 | class UDPCheckProtocol(DatagramProtocol): 134 | 135 | def __init__(self, host, port, send, expect, deferred=None, 136 | timeout=None): 137 | self.host = host 138 | self.port = port 139 | self.send = send 140 | self.expect = expect 141 | self.deferred = deferred 142 | self.timeout = timeout 143 | 144 | def _finish(self, success, result): 145 | if not (self.delayed.cancelled or self.delayed.called): 146 | self.delayed.cancel() 147 | if self.deferred is not None: 148 | if success: 149 | self.deferred.callback(result) 150 | else: 151 | self.deferred.errback(result) 152 | self.deferred = None 153 | 154 | def startProtocol(self): 155 | self.transport.write(self.send, (self.host, self.port)) 156 | self.delayed = reactor.callLater(self.timeout, 157 | self._finish, 158 | False, TimeoutError()) 159 | 160 | def datagramReceived(self, datagram, addr): 161 | if datagram == self.expect: 162 | self._finish(True, True) 163 | else: 164 | self._finish(False, ValueError("unexpected reply")) 165 | 166 | 167 | @inlineCallbacks 168 | def do_udp_check(host, port, send, expect, timeout=None): 169 | """Generic connection check function.""" 170 | if not isIPAddress(host): 171 | try: 172 | ip = yield reactor.resolve(host, timeout=(1, timeout)) 173 | except DNSLookupError: 174 | raise ValueError("dns resolution failed") 175 | else: 176 | ip = host 177 | deferred = Deferred() 178 | protocol = UDPCheckProtocol(ip, port, send, expect, deferred, timeout) 179 | reactor.listenUDP(0, protocol) 180 | try: 181 | yield deferred 182 | 
except TimeoutError: 183 | if ip == host: 184 | raise ValueError("timed out") 185 | else: 186 | raise ValueError("timed out waiting for {}".format(ip)) 187 | 188 | 189 | def make_udp_check(host, port, send, expect, timeout=None, 190 | **kwargs): 191 | """Return a check for UDP connectivity.""" 192 | return make_check("udp:{}:{}".format(host, port), 193 | lambda: do_udp_check(host, port, send, expect, timeout), 194 | info="{}:{}".format(host, port)) 195 | 196 | 197 | def extract_host_port(url): 198 | parsed = urlparse.urlparse(url) 199 | host = parsed.hostname 200 | port = parsed.port 201 | scheme = parsed.scheme 202 | if not scheme: 203 | scheme = 'http' 204 | if port is None: 205 | if scheme == 'https': 206 | port = 443 207 | else: 208 | port = 80 209 | return host, port, scheme 210 | 211 | 212 | def make_http_check(url, method='GET', expected_code=200, **kwargs): 213 | subchecks = [] 214 | host, port, scheme = extract_host_port(url) 215 | proxy_url = kwargs.get('proxy_url') 216 | proxy_host = kwargs.get('proxy_host') 217 | proxy_port = int(kwargs.get('proxy_port', 8000)) 218 | timeout = kwargs.get('timeout', None) 219 | expected_code = int(expected_code) 220 | 221 | if proxy_host: 222 | subchecks.append(make_tcp_check(proxy_host, proxy_port, 223 | timeout=timeout)) 224 | else: 225 | subchecks.append(make_tcp_check(host, port, timeout=timeout)) 226 | 227 | @inlineCallbacks 228 | def do_request(): 229 | proxies = {} 230 | if proxy_url: 231 | proxies['http'] = proxies['https'] = proxy_url 232 | elif proxy_host: 233 | proxies['http'] = proxies['https'] = '{}:{}'.format( 234 | proxy_host, proxy_port) 235 | 236 | headers = kwargs.get('headers') 237 | body = kwargs.get('body') 238 | disable_tls_verification = kwargs.get('disable_tls_verification', 239 | False) 240 | allow_redirects = kwargs.get('allow_redirects', False) 241 | params = kwargs.get('params') 242 | cookies = kwargs.get('cookies') 243 | auth = kwargs.get('auth') 244 | digest_auth = kwargs.get('digest_auth') 
245 | 246 | args = { 247 | 'method': method, 248 | 'url': url, 249 | 'verify': not disable_tls_verification, 250 | 'timeout': timeout, 251 | 'allow_redirects': allow_redirects, 252 | } 253 | if headers: 254 | args['headers'] = headers 255 | if body: 256 | args['data'] = body 257 | if proxies: 258 | args['proxies'] = proxies 259 | if params: 260 | args['params'] = params 261 | if cookies: 262 | args['cookies'] = cookies 263 | if auth: 264 | args['auth'] = auth 265 | if digest_auth: 266 | args['auth'] = HTTPDigestAuth(digest_auth) 267 | 268 | if disable_tls_verification: 269 | disable_warnings() 270 | 271 | with Session() as session: 272 | request = session.request(**args) 273 | 274 | response = yield request 275 | if response.status_code != expected_code: 276 | raise RuntimeError( 277 | "Unexpected response code: {}".format( 278 | response.status_code)) 279 | 280 | subchecks.append(make_check('', do_request, 281 | info='{} {}'.format(method, url))) 282 | 283 | return add_check_prefix('http:{}'.format(url), 284 | sequential_check(subchecks)) 285 | 286 | 287 | def make_amqp_check(host, port, username, password, use_tls=True, vhost="/", 288 | timeout=None, **kwargs): 289 | """Return a check for AMQP connectivity.""" 290 | from txamqp.protocol import AMQClient 291 | from txamqp.client import TwistedDelegate 292 | from txamqp.spec import load as load_spec 293 | 294 | subchecks = [] 295 | subchecks.append(make_tcp_check(host, port, timeout=timeout)) 296 | 297 | if use_tls: 298 | subchecks.append(make_tls_check(host, port, verify=False, 299 | timeout=timeout)) 300 | 301 | @inlineCallbacks 302 | def do_auth(): 303 | """Connect and authenticate.""" 304 | delegate = TwistedDelegate() 305 | spec = load_spec(resource_stream('conn_check', 'amqp0-8.xml')) 306 | creator = ClientCreator(reactor, AMQClient, 307 | delegate, vhost, spec) 308 | client = yield creator.connectTCP(host, port, timeout=timeout) 309 | yield client.authenticate(username, password) 310 | 311 | 
subchecks.append(make_check("amqp:{}:{}".format(host, port), 312 | do_auth, info="user {}".format(username),)) 313 | return sequential_check(subchecks) 314 | 315 | 316 | def make_smtp_check(host, port, username, password, from_address, to_address, 317 | message='', subject='', helo_fallback=False, use_tls=True, 318 | timeout=None, **kwargs): 319 | """Return a check for SMTP connectivity.""" 320 | 321 | subchecks = [] 322 | subchecks.append(make_tcp_check(host, port, timeout=timeout)) 323 | 324 | if use_tls: 325 | subchecks.append(make_tls_check(host, port, verify=False, 326 | timeout=timeout)) 327 | 328 | @inlineCallbacks 329 | def do_connect(): 330 | """Connect and authenticate.""" 331 | result_deferred = Deferred() 332 | context_factory = None 333 | if use_tls: 334 | from twisted.internet import ssl as twisted_ssl 335 | context_factory = twisted_ssl.ClientContextFactory() 336 | 337 | body = MIMEText(message) 338 | body['Subject'] = subject 339 | factory = ESMTPSenderFactory( 340 | username, 341 | password, 342 | from_address, 343 | to_address, 344 | StringIO(body.as_string()), 345 | result_deferred, 346 | contextFactory=context_factory, 347 | requireTransportSecurity=use_tls, 348 | requireAuthentication=True, 349 | heloFallback=helo_fallback) 350 | 351 | if use_tls: 352 | reactor.connectSSL(host, port, factory, context_factory) 353 | else: 354 | reactor.connectTCP(host, port, factory) 355 | result = yield result_deferred 356 | 357 | if result[0] == 0: 358 | raise RuntimeError("failed to send email via smtp") 359 | 360 | subchecks.append(make_check("smtp:{}:{}".format(host, port), 361 | do_connect, info="user {}".format(username),)) 362 | return sequential_check(subchecks) 363 | 364 | 365 | def make_postgres_check(host, port, username, password, database, 366 | timeout=None, **kwargs): 367 | """Return a check for Postgres connectivity.""" 368 | 369 | import psycopg2 370 | subchecks = [] 371 | connect_kw = { 372 | 'host': host, 373 | 'user': username, 374 | 
'database': database, 375 | 'connect_timeout': timeout, 376 | } 377 | 378 | if host[0] != '/': 379 | connect_kw['port'] = port 380 | subchecks.append(make_tcp_check(host, port, timeout=timeout)) 381 | 382 | if password is not None: 383 | connect_kw['password'] = password 384 | 385 | def check_auth(): 386 | """Try to establish a postgres connection and log in.""" 387 | conn = psycopg2.connect(**connect_kw) 388 | conn.close() 389 | 390 | subchecks.append(make_check("postgres:{}:{}".format(host, port), 391 | check_auth, info="user {}".format(username), 392 | blocking=True)) 393 | 394 | return sequential_check(subchecks) 395 | 396 | 397 | def make_redis_check(host, port, password=None, timeout=None, 398 | **kwargs): 399 | """Make a check for the configured redis server.""" 400 | import txredis 401 | subchecks = [] 402 | subchecks.append(make_tcp_check(host, port, timeout=timeout)) 403 | 404 | @inlineCallbacks 405 | def do_connect(): 406 | """Connect and authenticate. 407 | """ 408 | client_creator = ClientCreator(reactor, txredis.client.RedisClient) 409 | client = yield client_creator.connectTCP(host=host, port=port, 410 | timeout=timeout) 411 | 412 | if password is None: 413 | ping = yield client.ping() 414 | if not ping: 415 | raise RuntimeError("failed to ping redis") 416 | else: 417 | resp = yield client.auth(password) 418 | if resp != 'OK': 419 | raise RuntimeError("failed to auth to redis") 420 | 421 | if password is not None: 422 | connect_info = "connect with auth" 423 | else: 424 | connect_info = "connect" 425 | subchecks.append(make_check(connect_info, do_connect)) 426 | 427 | return add_check_prefix('redis:{}:{}'.format(host, port), 428 | sequential_check(subchecks)) 429 | 430 | 431 | def make_memcache_check(host, port, password=None, timeout=None, 432 | **kwargs): 433 | """Make a check for the configured memcached server.""" 434 | subchecks = [] 435 | subchecks.append(make_tcp_check(host, port, timeout=timeout)) 436 | 437 | @inlineCallbacks 438 | def
do_connect(): 439 | """Connect and authenticate.""" 440 | client_creator = ClientCreator(reactor, MemCacheProtocol) 441 | client = yield client_creator.connectTCP(host=host, port=port, 442 | timeout=timeout) 443 | 444 | version = yield client.version() 445 | if version is None: 446 | raise RuntimeError('Failed to retrieve memcached server version') 447 | 448 | subchecks.append(make_check('connect', do_connect)) 449 | 450 | return add_check_prefix('memcache:{}:{}'.format(host, port), 451 | sequential_check(subchecks)) 452 | 453 | 454 | def make_mongodb_check(host, port=27017, username=None, password=None, 455 | database='test', timeout=10, **kwargs): 456 | """Return a check for MongoDB connectivity.""" 457 | 458 | import txmongo 459 | subchecks = [] 460 | subchecks.append(make_tcp_check(host, port, timeout=timeout)) 461 | 462 | port = int(port) 463 | 464 | @inlineCallbacks 465 | def do_connect(): 466 | """Try to establish a mongodb connection.""" 467 | conn = txmongo.MongoConnection(host, port) 468 | 469 | conn.uri['options']['connectTimeoutMS'] = int(timeout*1000) 470 | if username: 471 | conn.uri['username'] = username 472 | if password: 473 | conn.uri['password'] = password 474 | 475 | # We don't start our timeout callback until now, otherwise we might 476 | # elapse part of our timeout period during the earlier TCP check 477 | reactor.callLater(timeout, timeout_handler) 478 | 479 | mongo = yield conn 480 | names = yield mongo[database].collection_names() 481 | if names is None: 482 | raise RuntimeError('Failed to retrieve collection names') 483 | 484 | def timeout_handler(): 485 | """Manual timeout handler as txmongo timeout args don't always work.""" 486 | if 'deferred' in do_connect.func_dict: 487 | err = ValueError("timeout connecting to mongodb") 488 | do_connect.func_dict['deferred'].errback(err) 489 | 490 | if any((username, password)): 491 | connect_info = "connect with auth" 492 | else: 493 | connect_info = "connect" 494 |
subchecks.append(make_check(connect_info, do_connect)) 495 | 496 | return add_check_prefix('mongodb:{}:{}'.format(host, port), 497 | sequential_check(subchecks)) 498 | 499 | 500 | CHECKS = { 501 | 'tcp': { 502 | 'fn': make_tcp_check, 503 | 'args': ['host', 'port'], 504 | }, 505 | 'tls': { 506 | 'fn': make_tls_check, 507 | 'args': ['host', 'port'], 508 | }, 509 | 'udp': { 510 | 'fn': make_udp_check, 511 | 'args': ['host', 'port', 'send', 'expect'], 512 | }, 513 | 'http': { 514 | 'fn': make_http_check, 515 | 'args': ['url'], 516 | }, 517 | 'amqp': { 518 | 'fn': make_amqp_check, 519 | 'args': ['host', 'port', 'username', 'password'], 520 | }, 521 | 'postgres': { 522 | 'fn': make_postgres_check, 523 | 'args': ['host', 'port', 'username', 'password', 'database'], 524 | }, 525 | 'redis': { 526 | 'fn': make_redis_check, 527 | 'args': ['host', 'port'], 528 | }, 529 | 'memcache': { 530 | 'fn': make_memcache_check, 531 | 'args': ['host', 'port'], 532 | }, 533 | 'mongodb': { 534 | 'fn': make_mongodb_check, 535 | 'args': ['host'], 536 | }, 537 | 'smtp': { 538 | 'fn': make_smtp_check, 539 | 'args': ['host', 'port', 'username', 'password'], 540 | }, 541 | } 542 | 543 | CHECK_ALIASES = { 544 | 'mongo': 'mongodb', 545 | 'memcached': 'memcache', 546 | 'ssl': 'tls', 547 | 'postgresql': 'postgres', 548 | 'https': 'http', 549 | } 550 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | GNU GENERAL PUBLIC LICENSE 2 | Version 3, 29 June 2007 3 | 4 | Copyright (C) 2007 Free Software Foundation, Inc. 5 | Everyone is permitted to copy and distribute verbatim copies 6 | of this license document, but changing it is not allowed. 7 | 8 | Preamble 9 | 10 | The GNU General Public License is a free, copyleft license for 11 | software and other kinds of works. 
12 | 13 | The licenses for most software and other practical works are designed 14 | to take away your freedom to share and change the works. By contrast, 15 | the GNU General Public License is intended to guarantee your freedom to 16 | share and change all versions of a program--to make sure it remains free 17 | software for all its users. We, the Free Software Foundation, use the 18 | GNU General Public License for most of our software; it applies also to 19 | any other work released this way by its authors. You can apply it to 20 | your programs, too. 21 | 22 | When we speak of free software, we are referring to freedom, not 23 | price. Our General Public Licenses are designed to make sure that you 24 | have the freedom to distribute copies of free software (and charge for 25 | them if you wish), that you receive source code or can get it if you 26 | want it, that you can change the software or use pieces of it in new 27 | free programs, and that you know you can do these things. 28 | 29 | To protect your rights, we need to prevent others from denying you 30 | these rights or asking you to surrender the rights. Therefore, you have 31 | certain responsibilities if you distribute copies of the software, or if 32 | you modify it: responsibilities to respect the freedom of others. 33 | 34 | For example, if you distribute copies of such a program, whether 35 | gratis or for a fee, you must pass on to the recipients the same 36 | freedoms that you received. You must make sure that they, too, receive 37 | or can get the source code. And you must show them these terms so they 38 | know their rights. 39 | 40 | Developers that use the GNU GPL protect your rights with two steps: 41 | (1) assert copyright on the software, and (2) offer you this License 42 | giving you legal permission to copy, distribute and/or modify it. 43 | 44 | For the developers' and authors' protection, the GPL clearly explains 45 | that there is no warranty for this free software. 
For both users' and 46 | authors' sake, the GPL requires that modified versions be marked as 47 | changed, so that their problems will not be attributed erroneously to 48 | authors of previous versions. 49 | 50 | Some devices are designed to deny users access to install or run 51 | modified versions of the software inside them, although the manufacturer 52 | can do so. This is fundamentally incompatible with the aim of 53 | protecting users' freedom to change the software. The systematic 54 | pattern of such abuse occurs in the area of products for individuals to 55 | use, which is precisely where it is most unacceptable. Therefore, we 56 | have designed this version of the GPL to prohibit the practice for those 57 | products. If such problems arise substantially in other domains, we 58 | stand ready to extend this provision to those domains in future versions 59 | of the GPL, as needed to protect the freedom of users. 60 | 61 | Finally, every program is threatened constantly by software patents. 62 | States should not allow patents to restrict development and use of 63 | software on general-purpose computers, but in those that do, we wish to 64 | avoid the special danger that patents applied to a free program could 65 | make it effectively proprietary. To prevent this, the GPL assures that 66 | patents cannot be used to render the program non-free. 67 | 68 | The precise terms and conditions for copying, distribution and 69 | modification follow. 70 | 71 | TERMS AND CONDITIONS 72 | 73 | 0. Definitions. 74 | 75 | "This License" refers to version 3 of the GNU General Public License. 76 | 77 | "Copyright" also means copyright-like laws that apply to other kinds of 78 | works, such as semiconductor masks. 79 | 80 | "The Program" refers to any copyrightable work licensed under this 81 | License. Each licensee is addressed as "you". "Licensees" and 82 | "recipients" may be individuals or organizations. 
83 | 84 | To "modify" a work means to copy from or adapt all or part of the work 85 | in a fashion requiring copyright permission, other than the making of an 86 | exact copy. The resulting work is called a "modified version" of the 87 | earlier work or a work "based on" the earlier work. 88 | 89 | A "covered work" means either the unmodified Program or a work based 90 | on the Program. 91 | 92 | To "propagate" a work means to do anything with it that, without 93 | permission, would make you directly or secondarily liable for 94 | infringement under applicable copyright law, except executing it on a 95 | computer or modifying a private copy. Propagation includes copying, 96 | distribution (with or without modification), making available to the 97 | public, and in some countries other activities as well. 98 | 99 | To "convey" a work means any kind of propagation that enables other 100 | parties to make or receive copies. Mere interaction with a user through 101 | a computer network, with no transfer of a copy, is not conveying. 102 | 103 | An interactive user interface displays "Appropriate Legal Notices" 104 | to the extent that it includes a convenient and prominently visible 105 | feature that (1) displays an appropriate copyright notice, and (2) 106 | tells the user that there is no warranty for the work (except to the 107 | extent that warranties are provided), that licensees may convey the 108 | work under this License, and how to view a copy of this License. If 109 | the interface presents a list of user commands or options, such as a 110 | menu, a prominent item in the list meets this criterion. 111 | 112 | 1. Source Code. 113 | 114 | The "source code" for a work means the preferred form of the work 115 | for making modifications to it. "Object code" means any non-source 116 | form of a work. 
117 | 118 | A "Standard Interface" means an interface that either is an official 119 | standard defined by a recognized standards body, or, in the case of 120 | interfaces specified for a particular programming language, one that 121 | is widely used among developers working in that language. 122 | 123 | The "System Libraries" of an executable work include anything, other 124 | than the work as a whole, that (a) is included in the normal form of 125 | packaging a Major Component, but which is not part of that Major 126 | Component, and (b) serves only to enable use of the work with that 127 | Major Component, or to implement a Standard Interface for which an 128 | implementation is available to the public in source code form. A 129 | "Major Component", in this context, means a major essential component 130 | (kernel, window system, and so on) of the specific operating system 131 | (if any) on which the executable work runs, or a compiler used to 132 | produce the work, or an object code interpreter used to run it. 133 | 134 | The "Corresponding Source" for a work in object code form means all 135 | the source code needed to generate, install, and (for an executable 136 | work) run the object code and to modify the work, including scripts to 137 | control those activities. However, it does not include the work's 138 | System Libraries, or general-purpose tools or generally available free 139 | programs which are used unmodified in performing those activities but 140 | which are not part of the work. For example, Corresponding Source 141 | includes interface definition files associated with source files for 142 | the work, and the source code for shared libraries and dynamically 143 | linked subprograms that the work is specifically designed to require, 144 | such as by intimate data communication or control flow between those 145 | subprograms and other parts of the work. 
146 | 147 | The Corresponding Source need not include anything that users 148 | can regenerate automatically from other parts of the Corresponding 149 | Source. 150 | 151 | The Corresponding Source for a work in source code form is that 152 | same work. 153 | 154 | 2. Basic Permissions. 155 | 156 | All rights granted under this License are granted for the term of 157 | copyright on the Program, and are irrevocable provided the stated 158 | conditions are met. This License explicitly affirms your unlimited 159 | permission to run the unmodified Program. The output from running a 160 | covered work is covered by this License only if the output, given its 161 | content, constitutes a covered work. This License acknowledges your 162 | rights of fair use or other equivalent, as provided by copyright law. 163 | 164 | You may make, run and propagate covered works that you do not 165 | convey, without conditions so long as your license otherwise remains 166 | in force. You may convey covered works to others for the sole purpose 167 | of having them make modifications exclusively for you, or provide you 168 | with facilities for running those works, provided that you comply with 169 | the terms of this License in conveying all material for which you do 170 | not control copyright. Those thus making or running the covered works 171 | for you must do so exclusively on your behalf, under your direction 172 | and control, on terms that prohibit them from making any copies of 173 | your copyrighted material outside their relationship with you. 174 | 175 | Conveying under any other circumstances is permitted solely under 176 | the conditions stated below. Sublicensing is not allowed; section 10 177 | makes it unnecessary. 178 | 179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law. 
180 | 181 | No covered work shall be deemed part of an effective technological 182 | measure under any applicable law fulfilling obligations under article 183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or 184 | similar laws prohibiting or restricting circumvention of such 185 | measures. 186 | 187 | When you convey a covered work, you waive any legal power to forbid 188 | circumvention of technological measures to the extent such circumvention 189 | is effected by exercising rights under this License with respect to 190 | the covered work, and you disclaim any intention to limit operation or 191 | modification of the work as a means of enforcing, against the work's 192 | users, your or third parties' legal rights to forbid circumvention of 193 | technological measures. 194 | 195 | 4. Conveying Verbatim Copies. 196 | 197 | You may convey verbatim copies of the Program's source code as you 198 | receive it, in any medium, provided that you conspicuously and 199 | appropriately publish on each copy an appropriate copyright notice; 200 | keep intact all notices stating that this License and any 201 | non-permissive terms added in accord with section 7 apply to the code; 202 | keep intact all notices of the absence of any warranty; and give all 203 | recipients a copy of this License along with the Program. 204 | 205 | You may charge any price or no price for each copy that you convey, 206 | and you may offer support or warranty protection for a fee. 207 | 208 | 5. Conveying Modified Source Versions. 209 | 210 | You may convey a work based on the Program, or the modifications to 211 | produce it from the Program, in the form of source code under the 212 | terms of section 4, provided that you also meet all of these conditions: 213 | 214 | a) The work must carry prominent notices stating that you modified 215 | it, and giving a relevant date. 
216 | 217 | b) The work must carry prominent notices stating that it is 218 | released under this License and any conditions added under section 219 | 7. This requirement modifies the requirement in section 4 to 220 | "keep intact all notices". 221 | 222 | c) You must license the entire work, as a whole, under this 223 | License to anyone who comes into possession of a copy. This 224 | License will therefore apply, along with any applicable section 7 225 | additional terms, to the whole of the work, and all its parts, 226 | regardless of how they are packaged. This License gives no 227 | permission to license the work in any other way, but it does not 228 | invalidate such permission if you have separately received it. 229 | 230 | d) If the work has interactive user interfaces, each must display 231 | Appropriate Legal Notices; however, if the Program has interactive 232 | interfaces that do not display Appropriate Legal Notices, your 233 | work need not make them do so. 234 | 235 | A compilation of a covered work with other separate and independent 236 | works, which are not by their nature extensions of the covered work, 237 | and which are not combined with it such as to form a larger program, 238 | in or on a volume of a storage or distribution medium, is called an 239 | "aggregate" if the compilation and its resulting copyright are not 240 | used to limit the access or legal rights of the compilation's users 241 | beyond what the individual works permit. Inclusion of a covered work 242 | in an aggregate does not cause this License to apply to the other 243 | parts of the aggregate. 244 | 245 | 6. Conveying Non-Source Forms. 
246 | 247 | You may convey a covered work in object code form under the terms 248 | of sections 4 and 5, provided that you also convey the 249 | machine-readable Corresponding Source under the terms of this License, 250 | in one of these ways: 251 | 252 | a) Convey the object code in, or embodied in, a physical product 253 | (including a physical distribution medium), accompanied by the 254 | Corresponding Source fixed on a durable physical medium 255 | customarily used for software interchange. 256 | 257 | b) Convey the object code in, or embodied in, a physical product 258 | (including a physical distribution medium), accompanied by a 259 | written offer, valid for at least three years and valid for as 260 | long as you offer spare parts or customer support for that product 261 | model, to give anyone who possesses the object code either (1) a 262 | copy of the Corresponding Source for all the software in the 263 | product that is covered by this License, on a durable physical 264 | medium customarily used for software interchange, for a price no 265 | more than your reasonable cost of physically performing this 266 | conveying of source, or (2) access to copy the 267 | Corresponding Source from a network server at no charge. 268 | 269 | c) Convey individual copies of the object code with a copy of the 270 | written offer to provide the Corresponding Source. This 271 | alternative is allowed only occasionally and noncommercially, and 272 | only if you received the object code with such an offer, in accord 273 | with subsection 6b. 274 | 275 | d) Convey the object code by offering access from a designated 276 | place (gratis or for a charge), and offer equivalent access to the 277 | Corresponding Source in the same way through the same place at no 278 | further charge. You need not require recipients to copy the 279 | Corresponding Source along with the object code. 
If the place to 280 | copy the object code is a network server, the Corresponding Source 281 | may be on a different server (operated by you or a third party) 282 | that supports equivalent copying facilities, provided you maintain 283 | clear directions next to the object code saying where to find the 284 | Corresponding Source. Regardless of what server hosts the 285 | Corresponding Source, you remain obligated to ensure that it is 286 | available for as long as needed to satisfy these requirements. 287 | 288 | e) Convey the object code using peer-to-peer transmission, provided 289 | you inform other peers where the object code and Corresponding 290 | Source of the work are being offered to the general public at no 291 | charge under subsection 6d. 292 | 293 | A separable portion of the object code, whose source code is excluded 294 | from the Corresponding Source as a System Library, need not be 295 | included in conveying the object code work. 296 | 297 | A "User Product" is either (1) a "consumer product", which means any 298 | tangible personal property which is normally used for personal, family, 299 | or household purposes, or (2) anything designed or sold for incorporation 300 | into a dwelling. In determining whether a product is a consumer product, 301 | doubtful cases shall be resolved in favor of coverage. For a particular 302 | product received by a particular user, "normally used" refers to a 303 | typical or common use of that class of product, regardless of the status 304 | of the particular user or of the way in which the particular user 305 | actually uses, or expects or is expected to use, the product. A product 306 | is a consumer product regardless of whether the product has substantial 307 | commercial, industrial or non-consumer uses, unless such uses represent 308 | the only significant mode of use of the product. 
309 | 310 | "Installation Information" for a User Product means any methods, 311 | procedures, authorization keys, or other information required to install 312 | and execute modified versions of a covered work in that User Product from 313 | a modified version of its Corresponding Source. The information must 314 | suffice to ensure that the continued functioning of the modified object 315 | code is in no case prevented or interfered with solely because 316 | modification has been made. 317 | 318 | If you convey an object code work under this section in, or with, or 319 | specifically for use in, a User Product, and the conveying occurs as 320 | part of a transaction in which the right of possession and use of the 321 | User Product is transferred to the recipient in perpetuity or for a 322 | fixed term (regardless of how the transaction is characterized), the 323 | Corresponding Source conveyed under this section must be accompanied 324 | by the Installation Information. But this requirement does not apply 325 | if neither you nor any third party retains the ability to install 326 | modified object code on the User Product (for example, the work has 327 | been installed in ROM). 328 | 329 | The requirement to provide Installation Information does not include a 330 | requirement to continue to provide support service, warranty, or updates 331 | for a work that has been modified or installed by the recipient, or for 332 | the User Product in which it has been modified or installed. Access to a 333 | network may be denied when the modification itself materially and 334 | adversely affects the operation of the network or violates the rules and 335 | protocols for communication across the network. 
336 | 337 | Corresponding Source conveyed, and Installation Information provided, 338 | in accord with this section must be in a format that is publicly 339 | documented (and with an implementation available to the public in 340 | source code form), and must require no special password or key for 341 | unpacking, reading or copying. 342 | 343 | 7. Additional Terms. 344 | 345 | "Additional permissions" are terms that supplement the terms of this 346 | License by making exceptions from one or more of its conditions. 347 | Additional permissions that are applicable to the entire Program shall 348 | be treated as though they were included in this License, to the extent 349 | that they are valid under applicable law. If additional permissions 350 | apply only to part of the Program, that part may be used separately 351 | under those permissions, but the entire Program remains governed by 352 | this License without regard to the additional permissions. 353 | 354 | When you convey a copy of a covered work, you may at your option 355 | remove any additional permissions from that copy, or from any part of 356 | it. (Additional permissions may be written to require their own 357 | removal in certain cases when you modify the work.) You may place 358 | additional permissions on material, added by you to a covered work, 359 | for which you have or can give appropriate copyright permission. 
360 | 361 | Notwithstanding any other provision of this License, for material you 362 | add to a covered work, you may (if authorized by the copyright holders of 363 | that material) supplement the terms of this License with terms: 364 | 365 | a) Disclaiming warranty or limiting liability differently from the 366 | terms of sections 15 and 16 of this License; or 367 | 368 | b) Requiring preservation of specified reasonable legal notices or 369 | author attributions in that material or in the Appropriate Legal 370 | Notices displayed by works containing it; or 371 | 372 | c) Prohibiting misrepresentation of the origin of that material, or 373 | requiring that modified versions of such material be marked in 374 | reasonable ways as different from the original version; or 375 | 376 | d) Limiting the use for publicity purposes of names of licensors or 377 | authors of the material; or 378 | 379 | e) Declining to grant rights under trademark law for use of some 380 | trade names, trademarks, or service marks; or 381 | 382 | f) Requiring indemnification of licensors and authors of that 383 | material by anyone who conveys the material (or modified versions of 384 | it) with contractual assumptions of liability to the recipient, for 385 | any liability that these contractual assumptions directly impose on 386 | those licensors and authors. 387 | 388 | All other non-permissive additional terms are considered "further 389 | restrictions" within the meaning of section 10. If the Program as you 390 | received it, or any part of it, contains a notice stating that it is 391 | governed by this License along with a term that is a further 392 | restriction, you may remove that term. 
If a license document contains 393 | a further restriction but permits relicensing or conveying under this 394 | License, you may add to a covered work material governed by the terms 395 | of that license document, provided that the further restriction does 396 | not survive such relicensing or conveying. 397 | 398 | If you add terms to a covered work in accord with this section, you 399 | must place, in the relevant source files, a statement of the 400 | additional terms that apply to those files, or a notice indicating 401 | where to find the applicable terms. 402 | 403 | Additional terms, permissive or non-permissive, may be stated in the 404 | form of a separately written license, or stated as exceptions; 405 | the above requirements apply either way. 406 | 407 | 8. Termination. 408 | 409 | You may not propagate or modify a covered work except as expressly 410 | provided under this License. Any attempt otherwise to propagate or 411 | modify it is void, and will automatically terminate your rights under 412 | this License (including any patent licenses granted under the third 413 | paragraph of section 11). 414 | 415 | However, if you cease all violation of this License, then your 416 | license from a particular copyright holder is reinstated (a) 417 | provisionally, unless and until the copyright holder explicitly and 418 | finally terminates your license, and (b) permanently, if the copyright 419 | holder fails to notify you of the violation by some reasonable means 420 | prior to 60 days after the cessation. 421 | 422 | Moreover, your license from a particular copyright holder is 423 | reinstated permanently if the copyright holder notifies you of the 424 | violation by some reasonable means, this is the first time you have 425 | received notice of violation of this License (for any work) from that 426 | copyright holder, and you cure the violation prior to 30 days after 427 | your receipt of the notice. 
428 | 429 | Termination of your rights under this section does not terminate the 430 | licenses of parties who have received copies or rights from you under 431 | this License. If your rights have been terminated and not permanently 432 | reinstated, you do not qualify to receive new licenses for the same 433 | material under section 10. 434 | 435 | 9. Acceptance Not Required for Having Copies. 436 | 437 | You are not required to accept this License in order to receive or 438 | run a copy of the Program. Ancillary propagation of a covered work 439 | occurring solely as a consequence of using peer-to-peer transmission 440 | to receive a copy likewise does not require acceptance. However, 441 | nothing other than this License grants you permission to propagate or 442 | modify any covered work. These actions infringe copyright if you do 443 | not accept this License. Therefore, by modifying or propagating a 444 | covered work, you indicate your acceptance of this License to do so. 445 | 446 | 10. Automatic Licensing of Downstream Recipients. 447 | 448 | Each time you convey a covered work, the recipient automatically 449 | receives a license from the original licensors, to run, modify and 450 | propagate that work, subject to this License. You are not responsible 451 | for enforcing compliance by third parties with this License. 452 | 453 | An "entity transaction" is a transaction transferring control of an 454 | organization, or substantially all assets of one, or subdividing an 455 | organization, or merging organizations. 
If propagation of a covered 456 | work results from an entity transaction, each party to that 457 | transaction who receives a copy of the work also receives whatever 458 | licenses to the work the party's predecessor in interest had or could 459 | give under the previous paragraph, plus a right to possession of the 460 | Corresponding Source of the work from the predecessor in interest, if 461 | the predecessor has it or can get it with reasonable efforts. 462 | 463 | You may not impose any further restrictions on the exercise of the 464 | rights granted or affirmed under this License. For example, you may 465 | not impose a license fee, royalty, or other charge for exercise of 466 | rights granted under this License, and you may not initiate litigation 467 | (including a cross-claim or counterclaim in a lawsuit) alleging that 468 | any patent claim is infringed by making, using, selling, offering for 469 | sale, or importing the Program or any portion of it. 470 | 471 | 11. Patents. 472 | 473 | A "contributor" is a copyright holder who authorizes use under this 474 | License of the Program or a work on which the Program is based. The 475 | work thus licensed is called the contributor's "contributor version". 476 | 477 | A contributor's "essential patent claims" are all patent claims 478 | owned or controlled by the contributor, whether already acquired or 479 | hereafter acquired, that would be infringed by some manner, permitted 480 | by this License, of making, using, or selling its contributor version, 481 | but do not include claims that would be infringed only as a 482 | consequence of further modification of the contributor version. For 483 | purposes of this definition, "control" includes the right to grant 484 | patent sublicenses in a manner consistent with the requirements of 485 | this License. 
  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement).  To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients.  "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License.  You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License.  If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all.  For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

  13. Use with the GNU Affero General Public License.

  Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work.  The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

  14. Revised Versions of this License.

  The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time.  Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

  Each version is given a distinguishing version number.  If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation.  If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.

  If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

  Later license versions may give you additional or different
permissions.  However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.

  15. Disclaimer of Warranty.

  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW.  EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE.  THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU.  SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

  16. Limitation of Liability.

  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.

  17. Interpretation of Sections 15 and 16.
  If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

                     END OF TERMS AND CONDITIONS

            How to Apply These Terms to Your New Programs

  If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

  To do so, attach the following notices to the program.  It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) <year>  <name of author>

    This program is free software: you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program.  If not, see <http://www.gnu.org/licenses/>.

Also add information on how to contact you by electronic and paper mail.

  If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:

    <program>  Copyright (C) <year>  <name of author>
    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
    This is free software, and you are welcome to redistribute it
    under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License.  Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".

  You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<http://www.gnu.org/licenses/>.

  The GNU General Public License does not permit incorporating your program
into proprietary programs.  If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library.  If this is what you want to do, use the GNU Lesser General
Public License instead of this License.  But first, please read
<http://www.gnu.org/licenses/why-not-lgpl.html>.